Dec 03 18:54:37 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 18:54:37 crc restorecon[4684]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:37 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 18:54:38 crc restorecon[4684]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc 
restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc 
restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc 
restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc 
restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38
crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc 
restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc 
restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 18:54:38 crc restorecon[4684]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc 
restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:38 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 
crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc 
restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc 
restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 18:54:39 crc restorecon[4684]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 18:54:39 crc kubenswrapper[4731]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 18:54:39 crc kubenswrapper[4731]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 18:54:39 crc kubenswrapper[4731]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 18:54:39 crc kubenswrapper[4731]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 18:54:39 crc kubenswrapper[4731]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 18:54:39 crc kubenswrapper[4731]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.656452 4731 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665021 4731 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665087 4731 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665100 4731 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665109 4731 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665124 4731 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665140 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665151 4731 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665162 4731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665171 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665181 4731 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665191 4731 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665199 4731 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665208 4731 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665216 4731 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665224 4731 feature_gate.go:330] unrecognized feature gate: Example Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665232 4731 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665241 4731 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665249 4731 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665285 4731 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 18:54:39 crc kubenswrapper[4731]: 
W1203 18:54:39.665298 4731 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665309 4731 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665320 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665332 4731 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665342 4731 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665350 4731 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665359 4731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665370 4731 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665379 4731 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665389 4731 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665398 4731 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665407 4731 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665416 4731 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665425 4731 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 18:54:39 crc 
kubenswrapper[4731]: W1203 18:54:39.665434 4731 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665443 4731 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665452 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665460 4731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665469 4731 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665478 4731 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665487 4731 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665495 4731 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665504 4731 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665512 4731 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665521 4731 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665529 4731 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665537 4731 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665546 4731 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665554 4731 feature_gate.go:330] unrecognized 
feature gate: SigstoreImageVerification Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665565 4731 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665575 4731 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665585 4731 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665594 4731 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665604 4731 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665615 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665626 4731 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665639 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665650 4731 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665660 4731 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665670 4731 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665681 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665691 4731 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665700 4731 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665709 
4731 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665718 4731 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665725 4731 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665734 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665743 4731 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665754 4731 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665768 4731 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665777 4731 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.665785 4731 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.665985 4731 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666009 4731 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666029 4731 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666043 4731 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666057 4731 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666067 4731 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 
18:54:39.666080 4731 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666093 4731 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666102 4731 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666112 4731 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666124 4731 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666136 4731 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666146 4731 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666156 4731 flags.go:64] FLAG: --cgroup-root="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666166 4731 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666175 4731 flags.go:64] FLAG: --client-ca-file="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666184 4731 flags.go:64] FLAG: --cloud-config="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666193 4731 flags.go:64] FLAG: --cloud-provider="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666203 4731 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666215 4731 flags.go:64] FLAG: --cluster-domain="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666224 4731 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666233 4731 flags.go:64] FLAG: --config-dir="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666242 4731 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 18:54:39 
crc kubenswrapper[4731]: I1203 18:54:39.666281 4731 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666296 4731 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666305 4731 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666315 4731 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666325 4731 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666335 4731 flags.go:64] FLAG: --contention-profiling="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666345 4731 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666355 4731 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666364 4731 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666373 4731 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666387 4731 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666397 4731 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666406 4731 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666415 4731 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666426 4731 flags.go:64] FLAG: --enable-server="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666435 4731 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666452 4731 flags.go:64] 
FLAG: --event-burst="100" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666463 4731 flags.go:64] FLAG: --event-qps="50" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666474 4731 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666484 4731 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666494 4731 flags.go:64] FLAG: --eviction-hard="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666507 4731 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666516 4731 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666525 4731 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666536 4731 flags.go:64] FLAG: --eviction-soft="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666546 4731 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666555 4731 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666564 4731 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666574 4731 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666583 4731 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666592 4731 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666602 4731 flags.go:64] FLAG: --feature-gates="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666614 4731 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666624 4731 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666634 4731 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666644 4731 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666653 4731 flags.go:64] FLAG: --healthz-port="10248" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666663 4731 flags.go:64] FLAG: --help="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666672 4731 flags.go:64] FLAG: --hostname-override="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666681 4731 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666691 4731 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666700 4731 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666710 4731 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666718 4731 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666728 4731 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666737 4731 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666746 4731 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666755 4731 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666765 4731 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666775 4731 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666785 4731 
flags.go:64] FLAG: --kube-reserved="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666795 4731 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666804 4731 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666814 4731 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666823 4731 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666832 4731 flags.go:64] FLAG: --lock-file="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666841 4731 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666851 4731 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666861 4731 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666878 4731 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666888 4731 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666898 4731 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666907 4731 flags.go:64] FLAG: --logging-format="text" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666918 4731 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666931 4731 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666943 4731 flags.go:64] FLAG: --manifest-url="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666956 4731 flags.go:64] FLAG: --manifest-url-header="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666972 4731 
flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666983 4731 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.666997 4731 flags.go:64] FLAG: --max-pods="110" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667007 4731 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667016 4731 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667026 4731 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667036 4731 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667045 4731 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667055 4731 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667098 4731 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667124 4731 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667134 4731 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667143 4731 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667153 4731 flags.go:64] FLAG: --pod-cidr="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667161 4731 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667175 4731 flags.go:64] FLAG: 
--pod-manifest-path="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667189 4731 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667199 4731 flags.go:64] FLAG: --pods-per-core="0" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667209 4731 flags.go:64] FLAG: --port="10250" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667219 4731 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667229 4731 flags.go:64] FLAG: --provider-id="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667239 4731 flags.go:64] FLAG: --qos-reserved="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667248 4731 flags.go:64] FLAG: --read-only-port="10255" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667282 4731 flags.go:64] FLAG: --register-node="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667291 4731 flags.go:64] FLAG: --register-schedulable="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667300 4731 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667318 4731 flags.go:64] FLAG: --registry-burst="10" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667327 4731 flags.go:64] FLAG: --registry-qps="5" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667336 4731 flags.go:64] FLAG: --reserved-cpus="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667348 4731 flags.go:64] FLAG: --reserved-memory="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667360 4731 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667370 4731 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667380 4731 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 
18:54:39.667389 4731 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667398 4731 flags.go:64] FLAG: --runonce="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667407 4731 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667417 4731 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667427 4731 flags.go:64] FLAG: --seccomp-default="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667436 4731 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667445 4731 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667454 4731 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667464 4731 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667475 4731 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667484 4731 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667493 4731 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667502 4731 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667511 4731 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667521 4731 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667546 4731 flags.go:64] FLAG: --system-cgroups="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667556 4731 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667577 4731 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667589 4731 flags.go:64] FLAG: --tls-cert-file="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667601 4731 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667617 4731 flags.go:64] FLAG: --tls-min-version="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667629 4731 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667640 4731 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667651 4731 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667663 4731 flags.go:64] FLAG: --topology-manager-scope="container" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667676 4731 flags.go:64] FLAG: --v="2" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667692 4731 flags.go:64] FLAG: --version="false" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667708 4731 flags.go:64] FLAG: --vmodule="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667723 4731 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.667735 4731 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668015 4731 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668030 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668043 4731 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 18:54:39 crc 
kubenswrapper[4731]: W1203 18:54:39.668055 4731 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668065 4731 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668076 4731 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668085 4731 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668095 4731 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668104 4731 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668114 4731 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668124 4731 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668133 4731 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668147 4731 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
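The same `unrecognized feature gate:` warnings recur several times in this boot (the kubelet re-parses the gate list more than once), which makes the raw journal hard to eyeball. A minimal sketch for deduplicating and counting them, assuming only the message format visible in the log above (`feature_gate.go:330] unrecognized feature gate: <Name>`):

```python
import re
from collections import Counter

# Matches the warning body emitted at feature_gate.go:330 in the log above.
# The format is assumed from this journal excerpt, not from kubelet source.
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def count_unrecognized_gates(log_text: str) -> Counter:
    """Return a Counter mapping feature-gate name -> number of warnings seen."""
    return Counter(GATE_RE.findall(log_text))

# Sample lines copied from the dump; real input would be e.g.
# `journalctl -u kubelet` output captured to a string.
sample = (
    "W1203 18:54:39.665140 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfig "
    "W1203 18:54:39.668305 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfig "
    "W1203 18:54:39.665162 4731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota"
)
counts = count_unrecognized_gates(sample)
print(counts["InsightsConfig"])   # each gate repeats once per parsing pass
```

Sorting `counts.most_common()` gives a quick view of which gates the cluster sets but this kubelet build does not recognize.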
Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668161 4731 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668173 4731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668183 4731 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668195 4731 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668211 4731 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668222 4731 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668232 4731 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668242 4731 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668284 4731 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668295 4731 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668305 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668315 4731 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668324 4731 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668334 4731 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668343 4731 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668353 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668363 4731 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668372 4731 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668382 4731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668391 4731 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668403 4731 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668416 4731 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668427 4731 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668437 4731 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668447 4731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668460 4731 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668471 4731 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668481 4731 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668491 4731 feature_gate.go:330] unrecognized feature 
gate: InsightsRuntimeExtractor Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668501 4731 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668513 4731 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668526 4731 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668536 4731 feature_gate.go:330] unrecognized feature gate: Example Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668546 4731 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668557 4731 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668567 4731 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668582 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668592 4731 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668603 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668613 4731 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668622 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668634 4731 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
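Interleaved with the gate warnings, the `flags.go:64] FLAG: --name="value"` lines record the kubelet's full effective command-line configuration (e.g. `--node-ip`, `--config`, `--system-reserved`). A small sketch for recovering that flag set as a dictionary; the `FLAG: --name="value"` quoting convention is assumed from this journal excerpt:

```python
import re

# Matches the flag-dump lines emitted at flags.go:64 in the log above.
# Values are always double-quoted in this excerpt, so a non-greedy quoted
# capture is sufficient; nested quotes are assumed not to occur.
FLAG_RE = re.compile(r'FLAG: --([\w-]+)="([^"]*)"')

def parse_flag_dump(log_text: str) -> dict[str, str]:
    """Return {flag-name: value} for every FLAG: line in the text."""
    return dict(FLAG_RE.findall(log_text))

# Sample lines copied from the dump above.
sample = (
    'I1203 18:54:39.666146 4731 flags.go:64] FLAG: --cgroup-driver="cgroupfs" '
    'I1203 18:54:39.667055 4731 flags.go:64] FLAG: --node-ip="192.168.126.11" '
    'I1203 18:54:39.666224 4731 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"'
)
flags = parse_flag_dump(sample)
print(flags["node-ip"])  # 192.168.126.11
```

Because the dump covers every flag, diffing the parsed dictionary between two boots is an easy way to spot configuration drift on the node.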
Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668645 4731 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668655 4731 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668665 4731 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668677 4731 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668688 4731 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668700 4731 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668710 4731 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668721 4731 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668730 4731 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668738 4731 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668746 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668754 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668761 4731 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668769 4731 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668777 4731 
feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.668785 4731 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.668815 4731 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.681112 4731 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.681176 4731 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681312 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681328 4731 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681336 4731 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681344 4731 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681352 4731 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681360 4731 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681367 4731 feature_gate.go:330] unrecognized feature gate: 
PersistentIPsForVirtualization Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681374 4731 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681381 4731 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681388 4731 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681395 4731 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681404 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681412 4731 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681421 4731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681429 4731 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681436 4731 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681442 4731 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681449 4731 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681455 4731 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681463 4731 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681469 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 18:54:39 crc 
kubenswrapper[4731]: W1203 18:54:39.681475 4731 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681481 4731 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681488 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681494 4731 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681500 4731 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681507 4731 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681513 4731 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681521 4731 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681531 4731 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681538 4731 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681545 4731 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681554 4731 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681563 4731 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681571 4731 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681578 4731 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681584 4731 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681594 4731 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681602 4731 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681610 4731 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681618 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681626 4731 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681634 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681641 4731 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681649 4731 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681656 4731 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681665 4731 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681672 4731 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681679 4731 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681687 4731 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681693 4731 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681700 4731 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681706 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681713 4731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681719 4731 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681726 4731 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681733 4731 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681739 4731 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681745 4731 feature_gate.go:330] unrecognized feature gate: Example Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681752 4731 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681758 4731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681765 4731 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 18:54:39 crc 
kubenswrapper[4731]: W1203 18:54:39.681774 4731 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681782 4731 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681789 4731 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681801 4731 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681811 4731 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681819 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681827 4731 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681835 4731 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.681842 4731 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.681854 4731 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682102 4731 feature_gate.go:330] unrecognized feature gate: 
PlatformOperators Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682114 4731 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682124 4731 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682134 4731 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682142 4731 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682151 4731 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682157 4731 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682164 4731 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682170 4731 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682177 4731 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682184 4731 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682191 4731 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682198 4731 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682205 4731 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682212 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682218 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682224 4731 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682231 4731 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682237 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682245 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682276 4731 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682284 4731 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682290 4731 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682297 4731 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 
18:54:39.682304 4731 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682312 4731 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682318 4731 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682323 4731 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682329 4731 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682344 4731 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682350 4731 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682356 4731 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682361 4731 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682410 4731 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682418 4731 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682425 4731 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682432 4731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682438 4731 feature_gate.go:330] unrecognized feature gate: Example Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682444 4731 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 18:54:39 crc 
kubenswrapper[4731]: W1203 18:54:39.682451 4731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682457 4731 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682463 4731 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682468 4731 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682474 4731 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682479 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682485 4731 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682491 4731 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682497 4731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682502 4731 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682508 4731 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682513 4731 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682518 4731 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682523 4731 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682529 4731 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682534 4731 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682539 4731 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682546 4731 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682553 4731 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682559 4731 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682566 4731 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682571 4731 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682577 4731 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682583 4731 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682588 4731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682596 4731 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682603 4731 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682609 4731 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682614 4731 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682619 4731 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682625 4731 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.682630 4731 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 18:54:39 crc 
kubenswrapper[4731]: I1203 18:54:39.682641 4731 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.682897 4731 server.go:940] "Client rotation is on, will bootstrap in background" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.687068 4731 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.687219 4731 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.687966 4731 server.go:997] "Starting client certificate rotation" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.688020 4731 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.688283 4731 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-13 10:07:45.122010135 +0000 UTC Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.688415 4731 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 231h13m5.433598782s for next certificate rotation Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.695800 4731 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 
18:54:39.697808 4731 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.708514 4731 log.go:25] "Validated CRI v1 runtime API" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.736329 4731 log.go:25] "Validated CRI v1 image API" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.738942 4731 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.742613 4731 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-18-50-32-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.742665 4731 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.770227 4731 manager.go:217] Machine: {Timestamp:2025-12-03 18:54:39.768120328 +0000 UTC m=+0.366714832 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f0d786ec-d814-4b2d-8cec-fe62d92000dd BootID:0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff Filesystems:[{Device:/dev/shm 
DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:83:94:2e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:83:94:2e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:09:03:2a Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c2:ef:4c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f5:60:10 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c5:ec:e4 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8a:0c:89:62:ad:b1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5a:46:fe:f2:e0:07 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 
BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} 
{Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.770779 4731 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.771004 4731 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.772191 4731 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.772602 4731 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.772679 4731 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.773096 4731 topology_manager.go:138] "Creating topology manager with none policy" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.773120 4731 container_manager_linux.go:303] "Creating device plugin manager" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.773515 4731 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.773604 4731 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.774212 4731 state_mem.go:36] "Initialized new in-memory state store" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.774410 4731 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.775859 4731 kubelet.go:418] "Attempting to sync node with API server" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.775910 4731 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.775963 4731 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.776000 4731 kubelet.go:324] "Adding apiserver pod source" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.776027 4731 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.778877 4731 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.779567 4731 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.779682 4731 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.779788 4731 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.779881 4731 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.779907 4731 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.780860 4731 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781752 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781804 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781822 
4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781838 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781861 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781887 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781902 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781927 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781945 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781960 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.781982 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.782020 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.782644 4731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.783617 4731 server.go:1280] "Started kubelet" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.784136 4731 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.784736 4731 server.go:163] "Starting 
to listen" address="0.0.0.0" port=10250 Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.784738 4731 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.785946 4731 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 18:54:39 crc systemd[1]: Started Kubernetes Kubelet. Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.787658 4731 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dc96887573623 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 18:54:39.783523875 +0000 UTC m=+0.382118409,LastTimestamp:2025-12-03 18:54:39.783523875 +0000 UTC m=+0.382118409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.788340 4731 server.go:460] "Adding debug handlers to kubelet server" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.788822 4731 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.788897 4731 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.790298 4731 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.790318 4731 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 
18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.790432 4731 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.791021 4731 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.791608 4731 factory.go:55] Registering systemd factory Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.791662 4731 factory.go:221] Registration of the systemd container factory successfully Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.791851 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="200ms" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.792112 4731 factory.go:153] Registering CRI-O factory Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.792154 4731 factory.go:221] Registration of the crio container factory successfully Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.792533 4731 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.792610 4731 factory.go:103] Registering Raw factory Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.792651 4731 manager.go:1196] Started watching for new ooms in manager Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.788944 4731 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 11:04:49.461085431 +0000 UTC Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.793004 
4731 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 688h10m9.668089245s for next certificate rotation Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.794216 4731 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.794312 4731 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.795543 4731 manager.go:319] Starting recovery of all containers Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810324 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810384 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810400 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810412 4731 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810427 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810442 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810456 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810469 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810485 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810496 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810511 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810527 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810540 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810581 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810597 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810609 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810628 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810640 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810653 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810665 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810676 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810689 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" 
Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810701 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810712 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810726 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810742 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810760 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810775 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810789 4731 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810802 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810816 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810856 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810893 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810906 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.810919 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.812000 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.812032 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.812562 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.812592 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.812606 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.812621 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.812634 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.813366 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.813432 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.813500 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.813654 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.813744 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.813777 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.813815 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814401 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814472 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814493 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814520 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814540 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814555 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814570 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814586 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814601 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814616 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814630 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814643 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814656 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814672 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814692 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814711 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814728 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814742 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814758 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814775 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814789 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814803 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814817 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814832 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814881 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814894 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814908 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814924 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814940 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814957 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814973 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.814986 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.815000 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.815974 4731 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816005 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816022 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816039 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816054 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816068 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816082 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816101 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816117 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816132 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816148 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816161 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816176 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816192 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816205 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816219 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816233 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816248 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816282 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816298 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816312 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816326 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816339 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816366 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816383 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816398 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816418 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816437 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816457 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816476 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816494 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816516 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816537 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816556 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816574 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816594 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816614 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816633 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816652 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816678 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816696 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816715 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816733 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816749 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816764 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816780 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816798 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816846 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816861 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816875 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816891 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816906 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816920 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816948 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816965 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" 
seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816981 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.816997 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817011 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817026 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817040 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817055 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817079 
4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817093 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817108 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817123 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817136 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817150 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817164 4731 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817180 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817195 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817210 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817226 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817242 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817296 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817310 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817324 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817344 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817358 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817372 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817386 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817402 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817417 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817431 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817445 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817460 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817478 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" 
seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817493 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817509 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817525 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817540 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817554 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817568 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 03 18:54:39 crc 
kubenswrapper[4731]: I1203 18:54:39.817583 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817597 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817612 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817626 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817640 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817656 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817672 4731 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817692 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817711 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817728 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817742 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817762 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817781 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817801 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817820 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817840 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817859 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817882 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817903 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817919 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817934 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817948 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817963 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817977 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.817995 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818015 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818030 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818044 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818058 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818073 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818088 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" 
seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818104 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818119 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818134 4731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818148 4731 reconstruct.go:97] "Volume reconstruction finished" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.818158 4731 reconciler.go:26] "Reconciler: start to sync state" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.834894 4731 manager.go:324] Recovery completed Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.848743 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.851399 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.851445 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.851459 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.851793 4731 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.853482 4731 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.853508 4731 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.853537 4731 state_mem.go:36] "Initialized new in-memory state store" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.854708 4731 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.854775 4731 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.854811 4731 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.854882 4731 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 18:54:39 crc kubenswrapper[4731]: W1203 18:54:39.857993 4731 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.858106 4731 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.865571 4731 policy_none.go:49] "None 
policy: Start" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.866624 4731 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.866666 4731 state_mem.go:35] "Initializing new in-memory state store" Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.871505 4731 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dc96887573623 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 18:54:39.783523875 +0000 UTC m=+0.382118409,LastTimestamp:2025-12-03 18:54:39.783523875 +0000 UTC m=+0.382118409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.892167 4731 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.936776 4731 manager.go:334] "Starting Device Plugin manager" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.936963 4731 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.936989 4731 server.go:79] "Starting device plugin registration server" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.937574 4731 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.937642 4731 container_log_manager.go:189] "Initializing container 
log rotate workers" workers=1 monitorPeriod="10s" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.937803 4731 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.937920 4731 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.937930 4731 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.947480 4731 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.955763 4731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.955863 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.957200 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.957247 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.957312 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.957508 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.957876 4731 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.957988 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.958892 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.958966 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.958990 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.959284 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.959412 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.959450 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.959537 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.959584 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.959603 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.960695 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.960712 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.960728 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.960737 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.960748 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.960753 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.960891 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.960992 
4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.961035 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.961977 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.961989 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.962050 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.962068 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.962012 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.962116 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.962404 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.962637 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.962687 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.963582 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.963612 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.963622 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.963720 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.963744 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.963756 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.963824 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.963855 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.964687 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.964716 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:39 crc kubenswrapper[4731]: I1203 18:54:39.964726 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:39 crc kubenswrapper[4731]: E1203 18:54:39.993892 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="400ms" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.020571 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.020638 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.020693 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.020736 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.020828 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.020874 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.020918 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.020959 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.021040 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.021108 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.021171 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.021219 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.021299 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.021409 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.021478 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.038217 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.040000 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.040061 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.040082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.040133 4731 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 18:54:40 crc kubenswrapper[4731]: E1203 18:54:40.040722 4731 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.17:6443: connect: connection refused" node="crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122623 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122741 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122788 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122821 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122856 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122887 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122920 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122917 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122950 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122987 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.122993 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123103 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123122 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123125 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123024 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123177 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123022 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123187 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123145 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123289 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123345 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123353 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123425 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123423 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123473 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123386 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123608 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123651 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123705 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.123826 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.241289 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.242806 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.242843 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.242855 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.242882 4731 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: E1203 18:54:40.243438 4731 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.287204 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.298352 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.322647 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: W1203 18:54:40.331204 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-da392ea38cf3d76f9f550ef2dc2f44d22bd18166d324a748d7191e2ba6048b97 WatchSource:0}: Error finding container da392ea38cf3d76f9f550ef2dc2f44d22bd18166d324a748d7191e2ba6048b97: Status 404 returned error can't find the container with id da392ea38cf3d76f9f550ef2dc2f44d22bd18166d324a748d7191e2ba6048b97
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.342242 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.354553 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: W1203 18:54:40.356268 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6a49a58ef2810e7bc68a0aca14ec1a28475fbdc736eb4f67acb62d72b48d8d4e WatchSource:0}: Error finding container 6a49a58ef2810e7bc68a0aca14ec1a28475fbdc736eb4f67acb62d72b48d8d4e: Status 404 returned error can't find the container with id 6a49a58ef2810e7bc68a0aca14ec1a28475fbdc736eb4f67acb62d72b48d8d4e
Dec 03 18:54:40 crc kubenswrapper[4731]: W1203 18:54:40.370943 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-bed01d6a7028085c2dfd3e6614a520164749b17051be2cba9134823738451cf0 WatchSource:0}: Error finding container bed01d6a7028085c2dfd3e6614a520164749b17051be2cba9134823738451cf0: Status 404 returned error can't find the container with id bed01d6a7028085c2dfd3e6614a520164749b17051be2cba9134823738451cf0
Dec 03 18:54:40 crc kubenswrapper[4731]: E1203 18:54:40.395186 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="800ms"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.644497 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:40 crc kubenswrapper[4731]: W1203 18:54:40.646026 4731 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Dec 03 18:54:40 crc kubenswrapper[4731]: E1203 18:54:40.646137 4731 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.646762 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.646816 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.646830 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.646868 4731 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: E1203 18:54:40.647531 4731 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.785187 4731 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Dec 03 18:54:40 crc kubenswrapper[4731]: W1203 18:54:40.787078 4731 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Dec 03 18:54:40 crc kubenswrapper[4731]: E1203 18:54:40.787199 4731 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.860867 4731 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605" exitCode=0
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.860951 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.861055 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bed01d6a7028085c2dfd3e6614a520164749b17051be2cba9134823738451cf0"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.861160 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.862242 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.862328 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.862349 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.863381 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.863419 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2bb2a055057e48005f1ae46cd01caee1ab0db0368f29bc40bb0b17800faecabd"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.867029 4731 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786" exitCode=0
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.867091 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.867111 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a49a58ef2810e7bc68a0aca14ec1a28475fbdc736eb4f67acb62d72b48d8d4e"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.867203 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.868283 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.868357 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.868371 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.870030 4731 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe" exitCode=0
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.870115 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.870149 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3939d633b04d7e31de087f16eacd44f162c5f4b733fbb928b0551600a8849d77"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.870283 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.871283 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.871309 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.871321 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.872600 4731 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f48ed40702f2b3e8201aae59328a2348ccecd0fe6a60f5e12dceb64d047b8f7d" exitCode=0
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.872635 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f48ed40702f2b3e8201aae59328a2348ccecd0fe6a60f5e12dceb64d047b8f7d"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.872664 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"da392ea38cf3d76f9f550ef2dc2f44d22bd18166d324a748d7191e2ba6048b97"}
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.872732 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.873510 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.873541 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.873551 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.873761 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.874488 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.874551 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:40 crc kubenswrapper[4731]: I1203 18:54:40.874569 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:41 crc kubenswrapper[4731]: W1203 18:54:41.051686 4731 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Dec 03 18:54:41 crc kubenswrapper[4731]: E1203 18:54:41.051797 4731 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Dec 03 18:54:41 crc kubenswrapper[4731]: W1203 18:54:41.056737 4731 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Dec 03 18:54:41 crc kubenswrapper[4731]: E1203 18:54:41.056792 4731 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Dec 03 18:54:41 crc kubenswrapper[4731]: E1203 18:54:41.195956 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="1.6s"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.448322 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.458654 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.458706 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.458716 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.458746 4731 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 18:54:41 crc kubenswrapper[4731]: E1203 18:54:41.459239 4731 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.878895 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.878963 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.878982 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.879134 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.880359 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.880753 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.880770 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.886758 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.886819 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.886838 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.886856 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.888028 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.888082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.888093 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.890826 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.890875 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.890895 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.890907 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.892713 4731 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef" exitCode=0
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.892781 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.892920 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.893723 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.893748 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.893759 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.896035 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c1cf0ddc94deb2c3631d9f11159b1b18475b3c5620e44f2c378036d2a4fce9a1"}
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.896129 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.897604 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.897655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:41 crc kubenswrapper[4731]: I1203 18:54:41.897665 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.906641 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a"}
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.906789 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.908630 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.908691 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.908713 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.910838 4731 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66" exitCode=0
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.910894 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66"}
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.910987 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.911041 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.912140 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.912174 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.912187 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.912143 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.912324 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:42 crc kubenswrapper[4731]: I1203 18:54:42.912389 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.059435 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.060883 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.060927 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.060943 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.060976 4731 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.921804 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05"}
Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.921882 4731 kubelet.go:2453] "SyncLoop
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806"} Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.921910 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5"} Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.921930 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722"} Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.921964 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.921998 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.923517 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.923570 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:43 crc kubenswrapper[4731]: I1203 18:54:43.923588 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.041296 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.042822 4731 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.045526 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.045627 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.045685 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.050535 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.766103 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.932000 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f"} Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.932086 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.932199 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.932223 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.933608 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.933678 4731 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.933699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.934068 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.934093 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.934141 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.934153 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.934110 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:44 crc kubenswrapper[4731]: I1203 18:54:44.934315 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.661763 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.941812 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.941861 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.941905 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 
18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.944405 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.944461 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.944484 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.945495 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.945555 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.945570 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.945852 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.945921 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:45 crc kubenswrapper[4731]: I1203 18:54:45.945939 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.082708 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.921513 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.944940 4731 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.945599 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.946100 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.946176 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.946198 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.946783 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.946825 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:46 crc kubenswrapper[4731]: I1203 18:54:46.946838 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.235743 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.236369 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.237746 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.237813 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 
18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.237827 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.524985 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.948002 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.949632 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.949694 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:47 crc kubenswrapper[4731]: I1203 18:54:47.949719 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:48 crc kubenswrapper[4731]: I1203 18:54:48.333160 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 18:54:48 crc kubenswrapper[4731]: I1203 18:54:48.333459 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:48 crc kubenswrapper[4731]: I1203 18:54:48.334833 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:48 crc kubenswrapper[4731]: I1203 18:54:48.334967 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:48 crc kubenswrapper[4731]: I1203 18:54:48.334996 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:49 crc kubenswrapper[4731]: E1203 18:54:49.947810 4731 eviction_manager.go:285] "Eviction manager: 
failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.090935 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.091178 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.092537 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.092617 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.092632 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.098963 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.786835 4731 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.961694 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.962829 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:51 crc kubenswrapper[4731]: I1203 18:54:51.962889 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:51 crc 
kubenswrapper[4731]: I1203 18:54:51.962908 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:52 crc kubenswrapper[4731]: I1203 18:54:52.286157 4731 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 18:54:52 crc kubenswrapper[4731]: I1203 18:54:52.286235 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 18:54:52 crc kubenswrapper[4731]: E1203 18:54:52.797886 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 03 18:54:52 crc kubenswrapper[4731]: I1203 18:54:52.929851 4731 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 18:54:52 crc kubenswrapper[4731]: I1203 18:54:52.929970 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 18:54:52 crc kubenswrapper[4731]: I1203 18:54:52.938220 4731 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 18:54:52 crc kubenswrapper[4731]: I1203 18:54:52.938362 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 18:54:54 crc kubenswrapper[4731]: I1203 18:54:54.091590 4731 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 18:54:54 crc kubenswrapper[4731]: I1203 18:54:54.091690 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.668914 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.669382 4731 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.671341 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.671444 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.671475 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.676077 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.971980 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.972970 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.973034 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:55 crc kubenswrapper[4731]: I1203 18:54:55.973050 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.554700 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.554925 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.556320 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:54:57 crc kubenswrapper[4731]: 
I1203 18:54:57.556393 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.556405 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.574659 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.926581 4731 trace.go:236] Trace[449612240]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 18:54:43.145) (total time: 14780ms): Dec 03 18:54:57 crc kubenswrapper[4731]: Trace[449612240]: ---"Objects listed" error: 14780ms (18:54:57.926) Dec 03 18:54:57 crc kubenswrapper[4731]: Trace[449612240]: [14.780983949s] [14.780983949s] END Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.926622 4731 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.926759 4731 trace.go:236] Trace[860398277]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 18:54:43.255) (total time: 14671ms): Dec 03 18:54:57 crc kubenswrapper[4731]: Trace[860398277]: ---"Objects listed" error: 14671ms (18:54:57.926) Dec 03 18:54:57 crc kubenswrapper[4731]: Trace[860398277]: [14.671204744s] [14.671204744s] END Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.926774 4731 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.927078 4731 trace.go:236] Trace[1582295501]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 18:54:43.673) (total time: 14253ms): Dec 03 18:54:57 crc kubenswrapper[4731]: Trace[1582295501]: ---"Objects listed" error: 14253ms (18:54:57.926) Dec 03 
18:54:57 crc kubenswrapper[4731]: Trace[1582295501]: [14.253876043s] [14.253876043s] END Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.927100 4731 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.928845 4731 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.929067 4731 trace.go:236] Trace[216083759]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 18:54:42.932) (total time: 14996ms): Dec 03 18:54:57 crc kubenswrapper[4731]: Trace[216083759]: ---"Objects listed" error: 14996ms (18:54:57.928) Dec 03 18:54:57 crc kubenswrapper[4731]: Trace[216083759]: [14.996412193s] [14.996412193s] END Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.929099 4731 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 18:54:57 crc kubenswrapper[4731]: E1203 18:54:57.929432 4731 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.961780 4731 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41474->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.961841 4731 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41458->192.168.126.11:17697: read: connection reset by peer" 
start-of-body= Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.961889 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41474->192.168.126.11:17697: read: connection reset by peer" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.961924 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41458->192.168.126.11:17697: read: connection reset by peer" Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.962465 4731 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 18:54:57 crc kubenswrapper[4731]: I1203 18:54:57.962563 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.788464 4731 apiserver.go:52] "Watching apiserver" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.793593 4731 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.794214 4731 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.794798 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.794897 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.794914 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.795541 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.795621 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.795782 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.796060 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.796283 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.796358 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.798691 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.798795 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.798911 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.798711 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.802310 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.802691 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.802913 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.802991 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.803233 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.821482 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.845746 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.860663 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.871645 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.884600 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.892899 4731 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.894899 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.910589 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.922416 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934745 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934802 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934830 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") 
pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934857 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934880 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934901 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934922 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934942 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934962 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.934991 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935021 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935453 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935568 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935606 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935662 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935744 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935780 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935815 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935833 4731 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935855 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935849 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935886 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935907 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935926 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935955 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.935976 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936064 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936085 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936103 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936205 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936230 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936290 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936310 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936336 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936361 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936379 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936398 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936427 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936444 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936461 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936484 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936509 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936528 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936582 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936610 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936627 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936643 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936661 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936679 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936695 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936722 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936741 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936794 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936816 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 
18:54:58.936834 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936856 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936878 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936894 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937226 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937245 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937281 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937300 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937321 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938678 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938731 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938753 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938778 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938816 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938914 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939008 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939049 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939068 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939093 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939334 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939383 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939410 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939441 4731 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939467 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939497 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939528 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936027 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.936403 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937611 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937707 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937807 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.937816 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938090 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938363 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938485 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938582 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938612 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938684 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938715 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.938919 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939061 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939006 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939177 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939196 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939276 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939668 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.939751 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940087 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940102 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940114 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940192 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940245 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940692 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940713 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940756 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940779 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941277 4731 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.940801 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941351 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941380 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941404 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941431 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" 
(UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941455 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941610 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941804 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941476 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941876 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941896 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941918 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941939 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941942 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941962 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.941986 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942009 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942031 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942050 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942073 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942096 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942117 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942150 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942174 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942412 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942433 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942458 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942479 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942502 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942523 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942546 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942569 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942591 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942615 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942640 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc 
kubenswrapper[4731]: I1203 18:54:58.942662 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.943976 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944002 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944022 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944041 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944060 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944082 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944138 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944166 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944187 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944209 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 
18:54:58.944231 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944248 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944286 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944304 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944323 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944344 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944363 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944386 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944410 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944428 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944452 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 
18:54:58.944473 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944493 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944524 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944545 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944566 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944590 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944613 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945061 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945116 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945137 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945158 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945180 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945205 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945226 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945247 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945290 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945312 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945335 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945357 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945379 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945398 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945419 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945439 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945460 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945480 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945500 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945520 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945542 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945566 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945588 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945609 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945631 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945651 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945737 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945758 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945784 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945809 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945828 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945847 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945865 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945882 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945900 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945993 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946013 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946034 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946052 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946074 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946096 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946117 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946140 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946160 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946182 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946201 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946221 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946240 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946273 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946292 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946311 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946332 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946351 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946369 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946420 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946448 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946467 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946487 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946514 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946564 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946583 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948406 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948437 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 
18:54:58.948456 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948478 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948501 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948528 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948549 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948626 4731 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948837 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948852 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948865 4731 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948876 4731 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948887 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948898 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 
18:54:58.948910 4731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948920 4731 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948930 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948941 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948951 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948962 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948973 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949180 4731 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949196 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949208 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949218 4731 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949233 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949244 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949271 4731 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949375 4731 reconciler_common.go:293] "Volume detached for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949386 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949398 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949411 4731 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949422 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949433 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949443 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949453 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" 
DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949463 4731 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949474 4731 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949486 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949496 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949506 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942492 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942565 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.942874 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.943643 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.943897 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944137 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944217 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944724 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.944925 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945046 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.945501 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946097 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946147 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946172 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946315 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946361 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.946616 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947045 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947059 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947264 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947424 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.956761 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947625 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947663 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947719 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947737 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.947946 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948000 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948201 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948322 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948468 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.948460 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949098 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.964476 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.964802 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.968866 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949325 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949393 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.960324 4731 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949718 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949673 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949811 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.950041 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.949968 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.950177 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.950587 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.950646 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.950711 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.950813 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951010 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951180 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951297 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951446 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951642 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951888 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951928 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951997 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.951990 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.952087 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.952419 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.952434 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.952923 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.952950 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.953182 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.953326 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.953514 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.953640 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.953665 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.953838 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.953884 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.954096 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.954105 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.956315 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.957366 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.957451 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.957440 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.957825 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.957958 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.958039 4731 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.958236 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.958449 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.958731 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.958762 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.959029 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.959053 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.959226 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.960384 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.960670 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.957611 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.960885 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.960957 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.960974 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.961719 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.962199 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.962799 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.962862 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.962901 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.969905 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.970463 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.976933 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.970549 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:54:59.470496752 +0000 UTC m=+20.069091256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.976961 4731 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.970914 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: 
"kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.970959 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.971003 4731 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.972006 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.972016 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.977188 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:54:59.477155508 +0000 UTC m=+20.075749972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.972247 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.972783 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.973029 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.973434 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.973612 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.973633 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.977377 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 18:54:59.477363884 +0000 UTC m=+20.075958348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.974131 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.974151 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.974272 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.974363 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.974385 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.977462 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.977471 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:54:59.477434417 +0000 UTC m=+20.076028881 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.977483 4731 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:54:58 crc kubenswrapper[4731]: E1203 18:54:58.977530 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 18:54:59.477516739 +0000 UTC m=+20.076111203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.985784 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.987439 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.989282 4731 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a" exitCode=255 Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.989410 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a"} Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.989567 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.993967 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:58 crc kubenswrapper[4731]: I1203 18:54:58.996048 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.001316 4731 scope.go:117] "RemoveContainer" containerID="7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.001332 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.003012 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.005116 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.009376 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.009621 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.010315 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.010450 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.011716 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.012027 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.012046 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.012293 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.012544 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.012648 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.012689 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.012875 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.013093 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.013146 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.013403 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.015691 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.015800 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.015855 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.015932 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.015987 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.016232 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.018859 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.019235 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.019303 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.019393 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.019480 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.019929 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.021322 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.022033 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.022246 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.026441 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.026800 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.027737 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.028132 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.028233 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.029583 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.029607 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.030106 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.030415 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.030090 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.047163 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051137 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051414 
4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051495 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051512 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051529 4731 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051567 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051597 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051613 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath 
\"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051695 4731 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051740 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051772 4731 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051785 4731 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051829 4731 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051874 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051887 4731 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051952 4731 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051969 4731 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051986 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.051999 4731 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052013 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052026 4731 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052039 4731 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052052 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052068 4731 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052108 4731 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052123 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052136 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052407 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052429 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052448 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052465 4731 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052482 4731 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052495 4731 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052508 4731 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052523 4731 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052514 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052536 4731 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052642 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052659 4731 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052589 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052675 4731 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052729 4731 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052744 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052758 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052770 4731 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052788 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052798 4731 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052806 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052816 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052826 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052838 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052847 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052855 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052865 4731 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052874 4731 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052883 4731 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052892 4731 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052901 4731 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" 
DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052912 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052925 4731 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052937 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052949 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052961 4731 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052970 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.052978 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 
18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053017 4731 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053051 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053065 4731 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053236 4731 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053249 4731 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053276 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053290 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053302 4731 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053313 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053325 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053337 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053347 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053358 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053367 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053378 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053388 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053398 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053408 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053419 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053430 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053441 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053453 4731 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053464 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053474 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053509 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053519 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053530 4731 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053540 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053550 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053559 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053569 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053580 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053590 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053600 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053609 4731 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053622 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053633 4731 reconciler_common.go:293] 
"Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053643 4731 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053654 4731 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053665 4731 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053673 4731 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053682 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053692 4731 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053700 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053709 4731 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053718 4731 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053726 4731 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053735 4731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053743 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053752 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053769 4731 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc 
kubenswrapper[4731]: I1203 18:54:59.053780 4731 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053789 4731 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053798 4731 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053808 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053817 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053826 4731 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053835 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053845 4731 reconciler_common.go:293] "Volume detached for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053853 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053864 4731 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053873 4731 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053883 4731 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053892 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053904 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053913 4731 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053923 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053932 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053941 4731 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053951 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053960 4731 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053969 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053978 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" 
DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053987 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.053995 4731 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054004 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054013 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054021 4731 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054032 4731 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054041 4731 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054052 4731 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054061 4731 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054072 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054081 4731 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.054090 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.063534 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.076139 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.089173 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.096539 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.096918 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.096953 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.097445 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.097680 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.099296 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.101270 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.101334 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.102695 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.103340 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.110230 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.110267 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.115696 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.118707 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.124925 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.126724 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.128594 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.140342 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154779 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154821 4731 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154838 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154853 4731 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154866 4731 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154881 4731 reconciler_common.go:293] "Volume detached for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154895 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154910 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154924 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154952 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154968 4731 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154982 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.154997 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.155011 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:54:59 crc kubenswrapper[4731]: W1203 18:54:59.224848 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-b03e1dda70e7b50e3f5e0353c082b3e2abfe0076a62c718b390bf90059946c61 WatchSource:0}: Error finding container b03e1dda70e7b50e3f5e0353c082b3e2abfe0076a62c718b390bf90059946c61: Status 404 returned error can't find the container with id b03e1dda70e7b50e3f5e0353c082b3e2abfe0076a62c718b390bf90059946c61 Dec 03 18:54:59 crc kubenswrapper[4731]: W1203 18:54:59.226658 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-baccf4d74a2cbe24260901ac6eb7fddf438eb52fc5b5afeadb7a7efd79302515 WatchSource:0}: Error finding container baccf4d74a2cbe24260901ac6eb7fddf438eb52fc5b5afeadb7a7efd79302515: Status 404 returned error can't find the container with id baccf4d74a2cbe24260901ac6eb7fddf438eb52fc5b5afeadb7a7efd79302515 Dec 03 18:54:59 crc kubenswrapper[4731]: W1203 18:54:59.227468 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-7344db48b2527791ed59c5d249c2f0ec49ba69c7255455bfeff4510b8086e4f8 WatchSource:0}: Error finding container 7344db48b2527791ed59c5d249c2f0ec49ba69c7255455bfeff4510b8086e4f8: Status 404 returned error can't find the container with id 
7344db48b2527791ed59c5d249c2f0ec49ba69c7255455bfeff4510b8086e4f8 Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.558947 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.559040 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.559068 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.559089 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.559107 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559176 4731 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559214 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:55:00.559165112 +0000 UTC m=+21.157759576 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559467 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:00.559452661 +0000 UTC m=+21.158047345 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559551 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559584 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559604 4731 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559614 4731 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559630 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559672 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-03 18:55:00.559655528 +0000 UTC m=+21.158250172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559697 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559721 4731 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559701 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:00.559690409 +0000 UTC m=+21.158285103 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:54:59 crc kubenswrapper[4731]: E1203 18:54:59.559834 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:00.559813953 +0000 UTC m=+21.158408437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.859863 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.860811 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.862847 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.864619 4731 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.867458 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.868507 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.869783 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.871828 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.873108 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.875138 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.876299 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.878726 4731 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.879927 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.887366 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.888476 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.889702 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:54:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.890642 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.891832 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.892720 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.894974 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.896228 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.897565 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.898220 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.898729 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.900016 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.900608 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.902000 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.902719 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.906293 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.907289 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.908412 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.908969 4731 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.909083 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.909059 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:54:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.911883 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.913452 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.913924 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.915679 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.916865 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.917562 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" 
path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.918712 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.919468 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.920530 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.921133 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.922128 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.922794 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.923672 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.924216 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.925091 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.928021 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.928551 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.929091 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.929620 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.930307 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.930882 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.932598 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.942770 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:54:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.971683 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:54:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:54:59 crc kubenswrapper[4731]: I1203 18:54:59.987095 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:54:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.000307 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69"} Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.000378 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b03e1dda70e7b50e3f5e0353c082b3e2abfe0076a62c718b390bf90059946c61"} Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.004289 4731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.005753 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.008875 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e"} Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.009646 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.012123 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b"} Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.012171 4731 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51"} Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.012184 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7344db48b2527791ed59c5d249c2f0ec49ba69c7255455bfeff4510b8086e4f8"} Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.013537 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"baccf4d74a2cbe24260901ac6eb7fddf438eb52fc5b5afeadb7a7efd79302515"} Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.026058 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.055499 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.076065 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.106728 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.124861 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.143680 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.167971 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.183427 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.198324 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.217320 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.569416 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.569500 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.569528 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.569549 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.569571 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569683 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569717 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569729 4731 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569719 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:55:02.569642471 +0000 UTC m=+23.168236975 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569833 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:02.569809986 +0000 UTC m=+23.168404480 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569873 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569920 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569939 4731 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569956 4731 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.569784 4731 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.570020 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:02.569991142 +0000 UTC m=+23.168585616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.570049 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:02.570036593 +0000 UTC m=+23.168631077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.570067 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:02.570058404 +0000 UTC m=+23.168652878 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.855340 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.855485 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.855577 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.855807 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:00 crc kubenswrapper[4731]: I1203 18:55:00.855848 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:00 crc kubenswrapper[4731]: E1203 18:55:00.855937 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.096864 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.103178 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.108561 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.113399 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.128723 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.129737 4731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.132116 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.132160 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.132173 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.132270 4731 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.144682 4731 
kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.145184 4731 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.146979 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.147026 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.147038 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.147060 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.147073 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.158824 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: E1203 18:55:01.212781 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.213096 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.219928 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.219972 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.219983 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.220000 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.220011 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.236061 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: E1203 18:55:01.236044 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.241927 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.241964 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.242003 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.242020 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.242033 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: E1203 18:55:01.255507 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.266545 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909
e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.271645 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.271677 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.271684 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.271698 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.271709 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.281637 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: E1203 18:55:01.284872 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.290603 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.290629 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.290638 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.290652 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.290663 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.296229 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: E1203 18:55:01.303988 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: E1203 18:55:01.304104 4731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.305983 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.306074 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.306086 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.306100 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.306110 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.311909 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.326447 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.339150 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.352369 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.365413 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.381902 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.392584 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:
41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.403097 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.410011 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.410057 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.410067 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.410082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.410091 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.413935 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:01Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.512598 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.512670 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.512690 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 
18:55:01.512720 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.512740 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.616388 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.616462 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.616481 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.616514 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.616535 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.719112 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.719181 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.719193 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.719212 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.719224 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.822413 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.822488 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.822508 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.822539 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.822566 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.926084 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.926161 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.926180 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.926210 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:01 crc kubenswrapper[4731]: I1203 18:55:01.926228 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:01Z","lastTransitionTime":"2025-12-03T18:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.030406 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.030463 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.030475 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.030493 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.030506 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.132679 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.132731 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.132744 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.132761 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.132773 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.235610 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.235656 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.235669 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.235685 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.235698 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.338364 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.338436 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.338450 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.338471 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.338483 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.441826 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.441886 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.441896 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.441919 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.441935 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.546738 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.546854 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.546878 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.547614 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.547805 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.590334 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.590559 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.590600 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:55:06.590551678 +0000 UTC m=+27.189146142 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.590673 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.590738 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.590778 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.590863 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.590886 4731 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.590905 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.590916 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.590936 4731 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.590955 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.590998 4731 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.591006 4731 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:02 crc 
kubenswrapper[4731]: E1203 18:55:02.590936 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:06.59092889 +0000 UTC m=+27.189523584 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.591089 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:06.591054504 +0000 UTC m=+27.189649068 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.591163 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:06.591145917 +0000 UTC m=+27.189740551 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.591195 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:06.591177838 +0000 UTC m=+27.189772522 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.650896 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.650984 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.651007 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.651037 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.651058 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.753633 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.753675 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.753683 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.753699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.753708 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.855106 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.855112 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.855147 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.855931 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.855734 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:02 crc kubenswrapper[4731]: E1203 18:55:02.856007 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.856740 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.856843 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.856870 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.856895 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.856916 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.959965 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.960046 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.960072 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.960107 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:02 crc kubenswrapper[4731]: I1203 18:55:02.960130 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:02Z","lastTransitionTime":"2025-12-03T18:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.024333 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.041099 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9708649
7b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.055159 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.062074 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.062237 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.062373 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.062460 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.062533 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.071861 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.090221 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.115336 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.132912 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.147951 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.166005 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.166055 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.166065 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.166085 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.166099 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.166312 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.179809 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T18:55:03Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.268828 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.268901 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.268920 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.268957 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.268975 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.373096 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.373181 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.373204 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.373235 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.373285 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.476111 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.476163 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.476174 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.476191 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.476203 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.579700 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.579756 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.579770 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.579788 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.579800 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.682164 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.682206 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.682216 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.682231 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.682241 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.785058 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.785133 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.785147 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.785170 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.785184 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.887817 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.887893 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.887908 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.887932 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.887947 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.990312 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.990386 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.990400 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.990426 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:03 crc kubenswrapper[4731]: I1203 18:55:03.990443 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:03Z","lastTransitionTime":"2025-12-03T18:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.095478 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.095524 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.095535 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.095553 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.095564 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.197763 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.197838 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.197850 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.197867 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.197879 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.300650 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.300701 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.300722 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.300740 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.300752 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.403611 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.403661 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.403673 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.403694 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.403708 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.506808 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.506861 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.506884 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.506907 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.506920 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.565609 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hzldf"] Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.566030 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hzldf" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.568821 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.569083 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.571060 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.593493 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.609358 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.609421 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.609433 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.609459 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.609474 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.612397 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.626344 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.636904 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.656298 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.671749 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.687660 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.702616 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.709343 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhdtp\" (UniqueName: \"kubernetes.io/projected/b514e547-82a6-4f87-8879-aea6f4cd653d-kube-api-access-bhdtp\") pod \"node-resolver-hzldf\" (UID: \"b514e547-82a6-4f87-8879-aea6f4cd653d\") " pod="openshift-dns/node-resolver-hzldf" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.709535 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b514e547-82a6-4f87-8879-aea6f4cd653d-hosts-file\") pod \"node-resolver-hzldf\" (UID: \"b514e547-82a6-4f87-8879-aea6f4cd653d\") " pod="openshift-dns/node-resolver-hzldf" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.711878 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.711922 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.711938 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.712243 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.712303 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.718178 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.733855 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.810236 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhdtp\" (UniqueName: \"kubernetes.io/projected/b514e547-82a6-4f87-8879-aea6f4cd653d-kube-api-access-bhdtp\") pod \"node-resolver-hzldf\" (UID: \"b514e547-82a6-4f87-8879-aea6f4cd653d\") " pod="openshift-dns/node-resolver-hzldf" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.810596 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b514e547-82a6-4f87-8879-aea6f4cd653d-hosts-file\") pod \"node-resolver-hzldf\" (UID: \"b514e547-82a6-4f87-8879-aea6f4cd653d\") " pod="openshift-dns/node-resolver-hzldf" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.810721 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b514e547-82a6-4f87-8879-aea6f4cd653d-hosts-file\") pod \"node-resolver-hzldf\" (UID: \"b514e547-82a6-4f87-8879-aea6f4cd653d\") " 
pod="openshift-dns/node-resolver-hzldf" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.815397 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.815429 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.815441 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.815460 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.815472 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.830382 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhdtp\" (UniqueName: \"kubernetes.io/projected/b514e547-82a6-4f87-8879-aea6f4cd653d-kube-api-access-bhdtp\") pod \"node-resolver-hzldf\" (UID: \"b514e547-82a6-4f87-8879-aea6f4cd653d\") " pod="openshift-dns/node-resolver-hzldf" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.855380 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.855470 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.855529 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:04 crc kubenswrapper[4731]: E1203 18:55:04.855534 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:04 crc kubenswrapper[4731]: E1203 18:55:04.855665 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:04 crc kubenswrapper[4731]: E1203 18:55:04.855762 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.880375 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hzldf" Dec 03 18:55:04 crc kubenswrapper[4731]: W1203 18:55:04.892350 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb514e547_82a6_4f87_8879_aea6f4cd653d.slice/crio-e5e41a3b9b7d236b025f623c299075ca5fdc9f771278f4fcf232ac6ec3d53482 WatchSource:0}: Error finding container e5e41a3b9b7d236b025f623c299075ca5fdc9f771278f4fcf232ac6ec3d53482: Status 404 returned error can't find the container with id e5e41a3b9b7d236b025f623c299075ca5fdc9f771278f4fcf232ac6ec3d53482 Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.917335 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.917390 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.917403 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.917423 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.917438 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:04Z","lastTransitionTime":"2025-12-03T18:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.969163 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mmjcd"] Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.969758 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.970099 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-x7zbk"] Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.973517 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x7zbk" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.973624 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.980684 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-clrw9"] Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.981052 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.981237 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.981387 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.981437 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.981592 4731 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.981714 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.982161 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.982488 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.982662 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.983545 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.987386 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 18:55:04 crc kubenswrapper[4731]: I1203 18:55:04.987639 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.000736 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:04Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.018184 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.021815 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.021846 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.021856 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.021873 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.021884 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.034510 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hzldf" event={"ID":"b514e547-82a6-4f87-8879-aea6f4cd653d","Type":"ContainerStarted","Data":"e5e41a3b9b7d236b025f623c299075ca5fdc9f771278f4fcf232ac6ec3d53482"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.045711 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.079603 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.108944 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113381 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-rootfs\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113420 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-socket-dir-parent\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113443 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-cni-bin\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113552 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ee4f887-8ce3-42c9-9886-06bdf109800c-cni-binary-copy\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113589 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-cni-multus\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113753 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-os-release\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113826 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/67102b0b-f85c-470b-8ba1-14eedddebae9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113867 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-cnibin\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113889 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-k8s-cni-cncf-io\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113914 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-netns\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113938 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-system-cni-dir\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.113976 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5v6x\" (UniqueName: \"kubernetes.io/projected/67102b0b-f85c-470b-8ba1-14eedddebae9-kube-api-access-j5v6x\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114010 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-hostroot\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114045 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvzsp\" (UniqueName: \"kubernetes.io/projected/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-kube-api-access-kvzsp\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114071 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbmfv\" (UniqueName: \"kubernetes.io/projected/4ee4f887-8ce3-42c9-9886-06bdf109800c-kube-api-access-kbmfv\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114159 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-os-release\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114187 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-proxy-tls\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: 
I1203 18:55:05.114210 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-etc-kubernetes\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114234 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-kubelet\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114277 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-conf-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114302 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67102b0b-f85c-470b-8ba1-14eedddebae9-cni-binary-copy\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114359 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-multus-certs\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 
18:55:05.114383 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114454 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114475 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-daemon-config\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114497 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-system-cni-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114516 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-cni-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 
18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.114551 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-cnibin\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.125440 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.125475 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.125485 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.125502 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.125512 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.146136 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.169990 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.197537 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215429 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-os-release\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215470 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/67102b0b-f85c-470b-8ba1-14eedddebae9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215513 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-cnibin\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215530 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-k8s-cni-cncf-io\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215548 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-netns\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215566 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-system-cni-dir\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215582 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5v6x\" (UniqueName: \"kubernetes.io/projected/67102b0b-f85c-470b-8ba1-14eedddebae9-kube-api-access-j5v6x\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215599 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-hostroot\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215619 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvzsp\" (UniqueName: \"kubernetes.io/projected/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-kube-api-access-kvzsp\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215641 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbmfv\" (UniqueName: \"kubernetes.io/projected/4ee4f887-8ce3-42c9-9886-06bdf109800c-kube-api-access-kbmfv\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215639 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-netns\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215661 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-os-release\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215762 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-os-release\") pod 
\"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215731 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-system-cni-dir\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215654 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-k8s-cni-cncf-io\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215873 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-proxy-tls\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215927 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-etc-kubernetes\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215947 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-kubelet\") pod \"multus-x7zbk\" (UID: 
\"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215965 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-conf-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.215982 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67102b0b-f85c-470b-8ba1-14eedddebae9-cni-binary-copy\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216004 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-conf-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216028 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-multus-certs\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216038 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-kubelet\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc 
kubenswrapper[4731]: I1203 18:55:05.216054 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216108 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216122 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-run-multus-certs\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216133 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-daemon-config\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216222 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-system-cni-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216231 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-hostroot\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216286 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-cni-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216300 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/67102b0b-f85c-470b-8ba1-14eedddebae9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216314 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-cnibin\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216349 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-cnibin\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216356 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-rootfs\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216387 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-socket-dir-parent\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216389 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-rootfs\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216411 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-cni-bin\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216442 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ee4f887-8ce3-42c9-9886-06bdf109800c-cni-binary-copy\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216461 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-cni-multus\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216470 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-socket-dir-parent\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216497 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-system-cni-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216524 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-cni-bin\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216559 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-host-var-lib-cni-multus\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216561 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-cni-dir\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " 
pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216583 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-os-release\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216708 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67102b0b-f85c-470b-8ba1-14eedddebae9-cni-binary-copy\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216772 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216782 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-cnibin\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216790 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ee4f887-8ce3-42c9-9886-06bdf109800c-multus-daemon-config\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.216865 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ee4f887-8ce3-42c9-9886-06bdf109800c-etc-kubernetes\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.217193 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ee4f887-8ce3-42c9-9886-06bdf109800c-cni-binary-copy\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.217175 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.225138 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/67102b0b-f85c-470b-8ba1-14eedddebae9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.234703 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.234742 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.234751 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.234773 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.234785 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.240905 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.241048 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvzsp\" (UniqueName: \"kubernetes.io/projected/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-kube-api-access-kvzsp\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.245094 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95dced4d-3fd5-43d3-b87d-21ec9c80de8b-proxy-tls\") pod \"machine-config-daemon-mmjcd\" (UID: \"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\") " pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.246929 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5v6x\" (UniqueName: 
\"kubernetes.io/projected/67102b0b-f85c-470b-8ba1-14eedddebae9-kube-api-access-j5v6x\") pod \"multus-additional-cni-plugins-clrw9\" (UID: \"67102b0b-f85c-470b-8ba1-14eedddebae9\") " pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.250905 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbmfv\" (UniqueName: \"kubernetes.io/projected/4ee4f887-8ce3-42c9-9886-06bdf109800c-kube-api-access-kbmfv\") pod \"multus-x7zbk\" (UID: \"4ee4f887-8ce3-42c9-9886-06bdf109800c\") " pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.266136 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.281330 4731 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.299053 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.301485 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3
f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.306783 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x7zbk" Dec 03 18:55:05 crc kubenswrapper[4731]: W1203 18:55:05.309811 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95dced4d_3fd5_43d3_b87d_21ec9c80de8b.slice/crio-e54d67b10e5f15fb2a92c83b09c6909cbb37832bd780af2a815bb759c4b70ae9 WatchSource:0}: Error finding container e54d67b10e5f15fb2a92c83b09c6909cbb37832bd780af2a815bb759c4b70ae9: Status 404 returned error can't find the container with id e54d67b10e5f15fb2a92c83b09c6909cbb37832bd780af2a815bb759c4b70ae9 Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.313969 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-clrw9" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.315152 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: W1203 18:55:05.337090 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67102b0b_f85c_470b_8ba1_14eedddebae9.slice/crio-a75885298403946b7afdb5d42afe6dab4efdf3bdcb53296b7c5a4ff766027971 WatchSource:0}: Error finding container a75885298403946b7afdb5d42afe6dab4efdf3bdcb53296b7c5a4ff766027971: Status 404 returned error can't find the container with id a75885298403946b7afdb5d42afe6dab4efdf3bdcb53296b7c5a4ff766027971 Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.337519 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.337571 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.337586 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc 
kubenswrapper[4731]: I1203 18:55:05.337610 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.337625 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.337663 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.355190 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.370180 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.387134 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.388188 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xcsvg"] Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.389162 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.393863 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.393999 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.394052 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.394217 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.394242 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 18:55:05 crc 
kubenswrapper[4731]: I1203 18:55:05.394454 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.394691 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.403138 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.417871 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.429539 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.440555 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.442159 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.442217 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.442231 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.442268 4731 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.442284 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.459421 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.474128 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.486142 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.499780 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.510722 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519383 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-kubelet\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519421 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-slash\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519471 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519501 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-config\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519526 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-env-overrides\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519558 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-node-log\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519574 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-script-lib\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519627 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-systemd-units\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519660 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-netns\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519679 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-bin\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519699 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-var-lib-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519715 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-etc-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519732 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519751 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-ovn\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519897 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-systemd\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.519994 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2676769f-27dd-4ac2-9398-7322817ce55a-ovn-node-metrics-cert\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.520033 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-log-socket\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.520084 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.520110 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-netd\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.520139 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsjcq\" (UniqueName: \"kubernetes.io/projected/2676769f-27dd-4ac2-9398-7322817ce55a-kube-api-access-wsjcq\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.529539 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.544725 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.544761 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.544771 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.544789 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.544799 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.551999 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.565489 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.578399 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.594542 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.608531 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.620490 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.620872 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.620918 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-netd\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.620941 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsjcq\" (UniqueName: \"kubernetes.io/projected/2676769f-27dd-4ac2-9398-7322817ce55a-kube-api-access-wsjcq\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.620972 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-kubelet\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.620992 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-slash\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621026 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621044 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-config\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621063 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-env-overrides\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621106 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-node-log\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 
18:55:05.621126 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-script-lib\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621154 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-systemd-units\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621169 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-netns\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621185 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-bin\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621205 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-var-lib-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621220 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-etc-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621237 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621275 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-ovn\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621306 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-systemd\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621311 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621322 4731 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2676769f-27dd-4ac2-9398-7322817ce55a-ovn-node-metrics-cert\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621348 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-netd\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621361 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-log-socket\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621388 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-etc-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621420 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-netns\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621423 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-log-socket\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621452 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-bin\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621484 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-systemd\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621490 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-node-log\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621522 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-ovn\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621491 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-systemd-units\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621594 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621613 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-kubelet\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621695 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621753 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-slash\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.621491 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-var-lib-openvswitch\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 
crc kubenswrapper[4731]: I1203 18:55:05.622121 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-env-overrides\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.622202 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-script-lib\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.622330 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-config\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.624806 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2676769f-27dd-4ac2-9398-7322817ce55a-ovn-node-metrics-cert\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.638286 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.639799 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsjcq\" (UniqueName: \"kubernetes.io/projected/2676769f-27dd-4ac2-9398-7322817ce55a-kube-api-access-wsjcq\") pod \"ovnkube-node-xcsvg\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.646758 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.646795 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.646808 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.646824 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.646836 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.656235 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.670039 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.681079 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:05Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.703502 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:05 crc kubenswrapper[4731]: W1203 18:55:05.713974 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2676769f_27dd_4ac2_9398_7322817ce55a.slice/crio-f96b492d102e1801cf1042e561926c64d5ff5475b4d9151bfd2bf249636e0f25 WatchSource:0}: Error finding container f96b492d102e1801cf1042e561926c64d5ff5475b4d9151bfd2bf249636e0f25: Status 404 returned error can't find the container with id f96b492d102e1801cf1042e561926c64d5ff5475b4d9151bfd2bf249636e0f25 Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.748890 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.748932 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.748943 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.748962 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.748973 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.851446 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.851487 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.851499 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.851515 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.851524 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.954789 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.954846 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.954861 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.954880 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:05 crc kubenswrapper[4731]: I1203 18:55:05.954893 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:05Z","lastTransitionTime":"2025-12-03T18:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.038288 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7zbk" event={"ID":"4ee4f887-8ce3-42c9-9886-06bdf109800c","Type":"ContainerStarted","Data":"200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.038335 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7zbk" event={"ID":"4ee4f887-8ce3-42c9-9886-06bdf109800c","Type":"ContainerStarted","Data":"863bf968169d9d0ce015846b1702dcd68c21b2986a5efd8b0573a4014d5c6119"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.040602 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"c98a346677b22d07d6f17286a0d2db33d97d9ba9acb50c45c802074e052c21ad"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.040639 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"e54d67b10e5f15fb2a92c83b09c6909cbb37832bd780af2a815bb759c4b70ae9"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.042502 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hzldf" event={"ID":"b514e547-82a6-4f87-8879-aea6f4cd653d","Type":"ContainerStarted","Data":"d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.044670 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6" exitCode=0 Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.044766 4731 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.044809 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"f96b492d102e1801cf1042e561926c64d5ff5475b4d9151bfd2bf249636e0f25"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.047884 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerStarted","Data":"867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.047926 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerStarted","Data":"a75885298403946b7afdb5d42afe6dab4efdf3bdcb53296b7c5a4ff766027971"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.056907 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.056947 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.056956 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.056971 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.056981 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.062480 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.075219 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.092596 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.111129 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.121610 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.141719 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.155905 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.160206 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.160232 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.160242 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.160272 4731 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.160283 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.170915 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.186226 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.203439 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.225950 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.238911 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.255780 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.262830 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.263018 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.263097 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.263161 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.263218 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.273612 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.290359 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.303965 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.317788 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.330278 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.343946 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.354135 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.365889 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.365926 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.365935 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.365949 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.365960 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.370907 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.390015 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f64
0f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.403075 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.417833 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.430771 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.443107 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.457791 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.468221 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.468536 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.468620 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 
18:55:06.468713 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.468806 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.472534 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:06Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.571939 4731 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.572434 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.572448 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.572462 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.572472 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.631530 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.631765 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:55:14.631735951 +0000 UTC m=+35.230330415 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.632161 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.632234 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.632302 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.632325 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632483 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632503 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632514 4731 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632549 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:14.632540986 +0000 UTC m=+35.231135450 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632668 4731 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632686 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632753 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632767 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:14.632749093 +0000 UTC m=+35.231343557 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632775 4731 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632871 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:14.632844966 +0000 UTC m=+35.231439480 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.632488 4731 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.633116 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:14.633093664 +0000 UTC m=+35.231688118 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.674855 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.675104 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.675170 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.675243 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.675349 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.778767 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.778807 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.778819 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.778838 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.778850 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.855995 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.856021 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.855995 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.856155 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.856199 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:06 crc kubenswrapper[4731]: E1203 18:55:06.856277 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.881359 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.881408 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.881420 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.881438 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.881453 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.984393 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.984442 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.984458 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.984481 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:06 crc kubenswrapper[4731]: I1203 18:55:06.984494 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:06Z","lastTransitionTime":"2025-12-03T18:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.056342 4731 generic.go:334] "Generic (PLEG): container finished" podID="67102b0b-f85c-470b-8ba1-14eedddebae9" containerID="867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa" exitCode=0 Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.056452 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerDied","Data":"867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.059285 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.071977 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.087558 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.087596 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.087606 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.087623 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.087635 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.088346 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.102365 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.115839 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.144583 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.162666 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.190513 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.190541 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.190550 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.190564 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.190573 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.193020 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z 
is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.220753 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.245374 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.259194 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.274082 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.285580 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.292455 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.292514 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.292528 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.292547 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.292557 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.296921 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.320329 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.338430 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.350544 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.364742 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.378197 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.393353 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.394864 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.394905 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.394951 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.394973 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.394983 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.412062 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.432417 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.447304 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.461246 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.476043 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.487649 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.497293 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.497357 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.497372 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.497430 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.497450 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.502035 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4
188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.518641 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.531009 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.600197 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.600246 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.600300 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.600381 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.600399 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.605586 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gkw94"] Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.606102 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.608648 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.609894 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.609937 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.610070 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.631379 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.647762 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.661439 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.678864 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.690696 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.703523 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.703738 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.703810 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.703877 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.703941 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.710116 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.726793 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.740031 4731 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.745767 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c0c2a2a-2759-494b-b107-06d0367eb3ed-serviceca\") pod \"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") " pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.745827 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c0c2a2a-2759-494b-b107-06d0367eb3ed-host\") pod \"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") " pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.745868 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvskp\" (UniqueName: \"kubernetes.io/projected/4c0c2a2a-2759-494b-b107-06d0367eb3ed-kube-api-access-fvskp\") pod 
\"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") " pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.754648 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.767775 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.800480 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.806195 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.806465 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.806549 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.806621 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.806685 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.838079 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.846553 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c0c2a2a-2759-494b-b107-06d0367eb3ed-host\") pod \"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") " pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.846621 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvskp\" (UniqueName: \"kubernetes.io/projected/4c0c2a2a-2759-494b-b107-06d0367eb3ed-kube-api-access-fvskp\") pod \"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") " pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.846642 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c0c2a2a-2759-494b-b107-06d0367eb3ed-serviceca\") pod \"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") 
" pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.846864 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c0c2a2a-2759-494b-b107-06d0367eb3ed-host\") pod \"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") " pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.847622 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c0c2a2a-2759-494b-b107-06d0367eb3ed-serviceca\") pod \"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") " pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.885046 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvskp\" (UniqueName: \"kubernetes.io/projected/4c0c2a2a-2759-494b-b107-06d0367eb3ed-kube-api-access-fvskp\") pod \"node-ca-gkw94\" (UID: \"4c0c2a2a-2759-494b-b107-06d0367eb3ed\") " pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.895649 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.909348 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.909414 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.909427 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:07 crc 
kubenswrapper[4731]: I1203 18:55:07.909450 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.909462 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:07Z","lastTransitionTime":"2025-12-03T18:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.918215 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gkw94" Dec 03 18:55:07 crc kubenswrapper[4731]: W1203 18:55:07.930659 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0c2a2a_2759_494b_b107_06d0367eb3ed.slice/crio-88db5d1700d36d48727eed7d7a1bbf36a7e7e4420c6b97d3667dd732121db4a9 WatchSource:0}: Error finding container 88db5d1700d36d48727eed7d7a1bbf36a7e7e4420c6b97d3667dd732121db4a9: Status 404 returned error can't find the container with id 88db5d1700d36d48727eed7d7a1bbf36a7e7e4420c6b97d3667dd732121db4a9 Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.938920 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:07 crc kubenswrapper[4731]: I1203 18:55:07.977917 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:07Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.011589 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.011633 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.011643 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.011664 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.011676 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.064890 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.065340 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.065360 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.065373 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.065383 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} Dec 03 
18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.065863 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gkw94" event={"ID":"4c0c2a2a-2759-494b-b107-06d0367eb3ed","Type":"ContainerStarted","Data":"88db5d1700d36d48727eed7d7a1bbf36a7e7e4420c6b97d3667dd732121db4a9"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.067450 4731 generic.go:334] "Generic (PLEG): container finished" podID="67102b0b-f85c-470b-8ba1-14eedddebae9" containerID="b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be" exitCode=0 Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.067557 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerDied","Data":"b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.090130 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.109720 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.113555 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.113582 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.113591 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.113604 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.113615 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.127468 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.141315 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.177818 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.220940 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.221008 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.221021 4731 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.221046 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.221071 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.222385 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f0362
3227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.259426 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.298743 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.324942 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.325005 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.325018 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 
18:55:08.325039 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.325053 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.340378 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.379014 4731 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.419237 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.428517 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.428601 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.428613 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.428636 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.428652 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.459467 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.498662 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.531598 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.531653 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.531666 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc 
kubenswrapper[4731]: I1203 18:55:08.531686 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.531698 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.538819 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.581622 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:08Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.634617 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc 
kubenswrapper[4731]: I1203 18:55:08.634667 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.634676 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.634696 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.634711 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.738211 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.738288 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.738299 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.738316 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.738330 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.841490 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.841571 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.841592 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.841627 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.841648 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.855881 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:08 crc kubenswrapper[4731]: E1203 18:55:08.856016 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.855881 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.856068 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:08 crc kubenswrapper[4731]: E1203 18:55:08.856086 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:08 crc kubenswrapper[4731]: E1203 18:55:08.856201 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.944271 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.944324 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.944335 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.944355 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:08 crc kubenswrapper[4731]: I1203 18:55:08.944368 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:08Z","lastTransitionTime":"2025-12-03T18:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.075766 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.075843 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.075865 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.075892 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.075912 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.077950 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gkw94" event={"ID":"4c0c2a2a-2759-494b-b107-06d0367eb3ed","Type":"ContainerStarted","Data":"aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.081892 4731 generic.go:334] "Generic (PLEG): container finished" podID="67102b0b-f85c-470b-8ba1-14eedddebae9" containerID="a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f" exitCode=0 Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.082010 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerDied","Data":"a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.087003 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.097655 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.114321 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.131384 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.147383 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.163901 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.177888 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.177946 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.177960 4731 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.177985 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.178003 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.186991 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.203370 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.215769 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.229767 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.242924 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.265384 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.281075 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.281117 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.281128 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.281150 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.281159 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.284308 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4
188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.303055 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.316382 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.329998 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.342706 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.355105 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.381536 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.384011 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.384078 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.384096 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.384125 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.384146 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.412838 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.433620 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.450544 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.472833 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.486599 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.486635 4731 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.486646 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.486665 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.486680 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.499019 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.538251 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.577459 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.589101 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.589151 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.589164 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.589186 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.589203 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.618064 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.665734 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 
18:55:09.692083 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.692147 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.692167 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.692194 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.692214 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.703937 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.739068 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.778974 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.795711 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.795765 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.795777 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc 
kubenswrapper[4731]: I1203 18:55:09.795805 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.795822 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.882867 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.898393 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.898437 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.898446 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:09 crc 
kubenswrapper[4731]: I1203 18:55:09.898461 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.898472 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:09Z","lastTransitionTime":"2025-12-03T18:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.899224 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.911652 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.939444 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:09 crc kubenswrapper[4731]: I1203 18:55:09.978390 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:09Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.001292 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.001329 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.001347 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.001372 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.001390 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.024352 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.060164 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.094736 4731 generic.go:334] "Generic (PLEG): container finished" podID="67102b0b-f85c-470b-8ba1-14eedddebae9" 
containerID="4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70" exitCode=0 Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.095016 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerDied","Data":"4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.101527 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.104869 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.104929 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.104954 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.104984 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.105007 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.145885 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.178473 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.208979 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.209013 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.209023 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.209042 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.209109 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.223752 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.265371 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.298633 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.310882 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.310917 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.310926 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc 
kubenswrapper[4731]: I1203 18:55:10.310944 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.310955 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.339001 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.378920 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ce
ea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.418538 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.418595 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.418614 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.418656 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.418669 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.427959 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.461575 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.499685 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.521987 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.522056 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.522075 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.522101 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.522120 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.539505 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.575508 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.624817 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.624870 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.624884 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.624904 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.624916 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.628197 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.659990 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.698384 4731 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.727967 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.728025 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.728036 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.728061 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.728075 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.737653 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.774816 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.820484 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.831033 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.831088 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.831099 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.831122 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.831134 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.855368 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.855441 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:10 crc kubenswrapper[4731]: E1203 18:55:10.855502 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.855526 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:10 crc kubenswrapper[4731]: E1203 18:55:10.855629 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:10 crc kubenswrapper[4731]: E1203 18:55:10.855729 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.859447 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.899137 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.934004 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.934053 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.934063 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:10 crc 
kubenswrapper[4731]: I1203 18:55:10.934082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.934095 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:10Z","lastTransitionTime":"2025-12-03T18:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.939645 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:10 crc kubenswrapper[4731]: I1203 18:55:10.979407 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ce
ea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:10Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.037011 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.037051 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.037063 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.037079 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.037094 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.101508 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.103868 4731 generic.go:334] "Generic (PLEG): container finished" podID="67102b0b-f85c-470b-8ba1-14eedddebae9" containerID="e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655" exitCode=0 Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.103897 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerDied","Data":"e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.124007 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.136832 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.139314 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.139352 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.139367 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.139389 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.139405 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.154378 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4
188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.170003 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.190512 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.220460 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.241873 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.241936 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.241948 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.241971 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.241984 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.260912 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.299417 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.336666 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.345446 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.345755 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.345782 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc 
kubenswrapper[4731]: I1203 18:55:11.345806 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.345824 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.377158 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.421103 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.449100 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.449157 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.449170 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.449189 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.449202 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.463835 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.503581 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.543631 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.552546 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.552621 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.552641 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.552674 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.552894 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.578670 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.656618 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.656706 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.656730 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.656761 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.656780 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.658307 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.658342 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.658352 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.658365 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.658373 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: E1203 18:55:11.679392 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.685121 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.685208 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.685234 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.685302 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.685330 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: E1203 18:55:11.706393 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.711559 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.711601 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.711612 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.711634 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.711648 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: E1203 18:55:11.726870 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.732661 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.732718 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.732731 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.732751 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.732766 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: E1203 18:55:11.747578 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.751657 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.751716 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.751729 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.751753 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.751772 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: E1203 18:55:11.766949 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:11Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:11 crc kubenswrapper[4731]: E1203 18:55:11.767122 4731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.768857 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.768909 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.768920 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.768936 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.768946 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.871520 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.871557 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.871567 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.871580 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.871589 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.974194 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.974311 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.974355 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.974396 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:11 crc kubenswrapper[4731]: I1203 18:55:11.974422 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:11Z","lastTransitionTime":"2025-12-03T18:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.077769 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.077829 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.077841 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.077866 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.077883 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.180616 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.180655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.180665 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.180684 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.180695 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.283946 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.284015 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.284027 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.284052 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.284069 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.290853 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.317291 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf2
00ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.337357 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.354232 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.369331 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.383176 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.387372 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.387409 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.387427 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.387452 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.387472 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.404016 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.417486 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.429652 4731 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.441890 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.453154 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.466715 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.478958 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.490427 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.490483 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.490497 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.490518 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.490527 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.490550 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.503595 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.516891 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:12Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.592927 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.593019 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.593054 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.593089 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.593115 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.696401 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.696718 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.696731 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.696751 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.696769 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.799451 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.799529 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.799544 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.799564 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.799576 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.855160 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.855160 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:12 crc kubenswrapper[4731]: E1203 18:55:12.855427 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.855678 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:12 crc kubenswrapper[4731]: E1203 18:55:12.855933 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:12 crc kubenswrapper[4731]: E1203 18:55:12.856033 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.903344 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.903695 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.903855 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.904008 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:12 crc kubenswrapper[4731]: I1203 18:55:12.904146 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:12Z","lastTransitionTime":"2025-12-03T18:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.007171 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.007239 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.007272 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.007296 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.007309 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.110791 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.110849 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.110864 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.110889 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.110904 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.116567 4731 generic.go:334] "Generic (PLEG): container finished" podID="67102b0b-f85c-470b-8ba1-14eedddebae9" containerID="200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3" exitCode=0 Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.116723 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerDied","Data":"200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.126286 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.126618 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.136755 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.152229 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.161742 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.174061 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.194620 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.215411 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.215461 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.215471 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.215494 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.215505 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.216289 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.237029 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.251952 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.269147 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.283232 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.299226 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.316740 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.318875 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.318964 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.319197 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc 
kubenswrapper[4731]: I1203 18:55:13.320010 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.320198 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.332239 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.350524 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.368906 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\"
,\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f998
7a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.390358 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.416351 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.422327 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.422382 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.422395 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.422414 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.422428 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.430588 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.445430 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.459736 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.472921 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.497150 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.513984 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.525291 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.525347 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.525360 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.525383 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.525399 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.532823 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.550523 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.562207 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.576974 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.590013 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.602057 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.616744 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.628324 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc 
kubenswrapper[4731]: I1203 18:55:13.628366 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.628378 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.628402 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.628417 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.636852 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ce
ea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:13Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.731586 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.731655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.731668 4731 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.731691 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.731708 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.835756 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.835831 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.835851 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.835880 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.835901 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.939400 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.939466 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.939484 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.939513 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:13 crc kubenswrapper[4731]: I1203 18:55:13.939534 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:13Z","lastTransitionTime":"2025-12-03T18:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.044307 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.044383 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.044409 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.044443 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.044468 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.139244 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" event={"ID":"67102b0b-f85c-470b-8ba1-14eedddebae9","Type":"ContainerStarted","Data":"8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.139432 4731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.140397 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.147853 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.147906 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.147927 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.147955 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.147976 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.177832 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.182646 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf2
00ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.198647 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.216545 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.234505 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.251416 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.251467 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.251480 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.251503 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.251518 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.251519 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.270478 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.282437 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.294027 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.306850 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.315956 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.328121 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.342062 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.353153 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.354528 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.354586 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.354598 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc 
kubenswrapper[4731]: I1203 18:55:14.354623 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.354642 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.368540 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.387313 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1b
e9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.401789 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.416349 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.429990 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.441851 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.453655 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.457632 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc 
kubenswrapper[4731]: I1203 18:55:14.457684 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.457722 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.457748 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.457765 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.473746 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.486662 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.499046 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.510235 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.521487 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.532307 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.550180 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.563203 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.563307 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.563330 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.563360 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.563389 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.576529 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.588577 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.600721 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:14Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.635414 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.635549 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.635573 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635610 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 18:55:30.635576999 +0000 UTC m=+51.234171453 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.635642 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.635674 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635711 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635727 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635739 4731 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635743 4731 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635777 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:30.635769284 +0000 UTC m=+51.234363748 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635792 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:30.635786424 +0000 UTC m=+51.234380888 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635815 4731 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635831 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635840 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635842 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:30.635833446 +0000 UTC m=+51.234427910 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635847 4731 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.635875 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:30.635867727 +0000 UTC m=+51.234462191 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.666738 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.666780 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.666991 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.667009 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.667022 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.772292 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.772353 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.772372 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.772392 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.772404 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.855412 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.855450 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.855494 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.855565 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.855671 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:14 crc kubenswrapper[4731]: E1203 18:55:14.855808 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.875088 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.875140 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.875154 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.875175 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:14 crc kubenswrapper[4731]: I1203 18:55:14.875529 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:14Z","lastTransitionTime":"2025-12-03T18:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.026427 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.026482 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.026499 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.026522 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.026577 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.128853 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.128896 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.128908 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.128928 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.128941 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.142421 4731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.231709 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.232055 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.232152 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.232302 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.232423 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.336112 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.336162 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.336178 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.336198 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.336210 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.439371 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.439418 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.439432 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.439457 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.439473 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.542018 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.542069 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.542081 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.542100 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.542114 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.644513 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.644557 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.644575 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.644591 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.644600 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.747584 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.747621 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.747629 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.747645 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.747657 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.849712 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.849757 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.849769 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.849788 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.849804 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.952103 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.952164 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.952177 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.952192 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:15 crc kubenswrapper[4731]: I1203 18:55:15.952201 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:15Z","lastTransitionTime":"2025-12-03T18:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.054545 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.054608 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.054619 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.054638 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.054717 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.146358 4731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.157074 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.157138 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.157157 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.157185 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.157206 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.260733 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.260802 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.260823 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.260852 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.260873 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.364476 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.364526 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.364538 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.364556 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.364594 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.467619 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.467676 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.467697 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.467719 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.467736 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.571353 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.571408 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.571418 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.571439 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.571455 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.673729 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.673775 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.673783 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.673797 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.673810 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.775871 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.775918 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.775929 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.775947 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.775958 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.856044 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.856114 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.856124 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:16 crc kubenswrapper[4731]: E1203 18:55:16.856366 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:16 crc kubenswrapper[4731]: E1203 18:55:16.856468 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:16 crc kubenswrapper[4731]: E1203 18:55:16.856567 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.879624 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.879688 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.879714 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.879749 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.879774 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.940857 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb"] Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.941905 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.944538 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.944711 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.958103 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f
1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:16Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.974418 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:16Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.982735 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.982810 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.982827 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.982848 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.982861 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:16Z","lastTransitionTime":"2025-12-03T18:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:16 crc kubenswrapper[4731]: I1203 18:55:16.992472 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:16Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.006668 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.021027 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.033280 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.045788 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.058730 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.062609 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33e88da1-3764-46ce-a91f-fe154f3a9dfe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.062667 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33e88da1-3764-46ce-a91f-fe154f3a9dfe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.062691 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33e88da1-3764-46ce-a91f-fe154f3a9dfe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.062711 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2cxj\" (UniqueName: \"kubernetes.io/projected/33e88da1-3764-46ce-a91f-fe154f3a9dfe-kube-api-access-q2cxj\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.072601 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.085187 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc 
kubenswrapper[4731]: I1203 18:55:17.085227 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.085239 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.085277 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.085291 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.085799 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.099099 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.119402 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.138944 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.151058 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/0.log" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.154549 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60" exitCode=1 Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.154611 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" 
event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.155773 4731 scope.go:117] "RemoveContainer" containerID="a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.163721 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33e88da1-3764-46ce-a91f-fe154f3a9dfe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.163767 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2cxj\" (UniqueName: \"kubernetes.io/projected/33e88da1-3764-46ce-a91f-fe154f3a9dfe-kube-api-access-q2cxj\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.163832 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33e88da1-3764-46ce-a91f-fe154f3a9dfe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.163857 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33e88da1-3764-46ce-a91f-fe154f3a9dfe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.164869 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33e88da1-3764-46ce-a91f-fe154f3a9dfe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.165060 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33e88da1-3764-46ce-a91f-fe154f3a9dfe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.168470 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.172332 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33e88da1-3764-46ce-a91f-fe154f3a9dfe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.186224 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2cxj\" (UniqueName: \"kubernetes.io/projected/33e88da1-3764-46ce-a91f-fe154f3a9dfe-kube-api-access-q2cxj\") pod \"ovnkube-control-plane-749d76644c-frxlb\" (UID: \"33e88da1-3764-46ce-a91f-fe154f3a9dfe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.187513 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.187536 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.187547 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.187565 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.187577 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.189576 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.206831 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.223193 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.236002 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.256868 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 18:55:16.468886 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 18:55:16.468915 6052 handler.go:208] Removed *v1.Pod 
event handler 6\\\\nI1203 18:55:16.469017 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 18:55:16.469079 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 18:55:16.469116 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 18:55:16.469128 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 18:55:16.469133 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 18:55:16.469164 6052 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 18:55:16.469168 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 18:55:16.469196 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 18:55:16.469166 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 18:55:16.469271 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 18:55:16.469290 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 18:55:16.469303 6052 factory.go:656] Stopping watch factory\\\\nI1203 18:55:16.469378 6052 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add
423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.262164 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.281545 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.290030 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.290080 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.290091 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.290110 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.290124 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.297092 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.311512 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alert
er-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.324971 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 
2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.336551 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.352986 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.372382 4731 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.393675 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.394018 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc 
kubenswrapper[4731]: I1203 18:55:17.394057 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.394068 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.394083 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.394093 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.409382 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.423974 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.441583 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.454150 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.466664 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:17Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.496993 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.497040 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.497054 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.497079 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.497098 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.599550 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.599612 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.599623 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.599641 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.599652 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.703632 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.703705 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.703730 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.703761 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.703784 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.806950 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.807018 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.807034 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.807061 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.807077 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.910705 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.910750 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.910762 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.910781 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:17 crc kubenswrapper[4731]: I1203 18:55:17.910791 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:17Z","lastTransitionTime":"2025-12-03T18:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.013846 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.013892 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.013901 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.013915 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.013926 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.117645 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.117721 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.117745 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.117777 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.117799 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.160554 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" event={"ID":"33e88da1-3764-46ce-a91f-fe154f3a9dfe","Type":"ContainerStarted","Data":"05e01341161a63e80b5a10589ddea70d7758c2af37c0cfd7ae4ebb750f84a37b"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.221387 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.221471 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.221491 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.221528 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.221549 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.324630 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.324688 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.324708 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.324731 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.324747 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.427798 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.427864 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.427884 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.427908 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.427923 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.530715 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.530756 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.530772 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.530793 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.530806 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.633181 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.633233 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.633244 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.633283 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.633297 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.736661 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.736704 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.736717 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.736735 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.736749 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.807374 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-p6zls"] Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.807944 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:18 crc kubenswrapper[4731]: E1203 18:55:18.808028 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.824608 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.838396 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3
5825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.839670 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.839748 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 
18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.839768 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.839792 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.839808 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.854632 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.854989 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.855049 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.855095 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:18 crc kubenswrapper[4731]: E1203 18:55:18.855284 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:18 crc kubenswrapper[4731]: E1203 18:55:18.855707 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:18 crc kubenswrapper[4731]: E1203 18:55:18.855831 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.867863 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.883884 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.883960 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nrt7\" (UniqueName: \"kubernetes.io/projected/83957f97-f30b-4ea7-8849-c7264d61fd52-kube-api-access-8nrt7\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.888493 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdc
abb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.905457 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.916511 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.938947 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18
:55:16Z\\\",\\\"message\\\":\\\"crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 18:55:16.468886 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 18:55:16.468915 6052 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 18:55:16.469017 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 18:55:16.469079 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 18:55:16.469116 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 18:55:16.469128 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 18:55:16.469133 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 18:55:16.469164 6052 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 18:55:16.469168 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 18:55:16.469196 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 18:55:16.469166 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 18:55:16.469271 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 18:55:16.469290 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 18:55:16.469303 6052 factory.go:656] Stopping watch factory\\\\nI1203 18:55:16.469378 6052 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add
423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.944873 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.944943 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.944971 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.944996 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.945025 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:18Z","lastTransitionTime":"2025-12-03T18:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.961043 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:18 crc 
kubenswrapper[4731]: I1203 18:55:18.984610 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nrt7\" (UniqueName: \"kubernetes.io/projected/83957f97-f30b-4ea7-8849-c7264d61fd52-kube-api-access-8nrt7\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.984722 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:18 crc kubenswrapper[4731]: E1203 18:55:18.984929 4731 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:18 crc kubenswrapper[4731]: E1203 18:55:18.985045 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs podName:83957f97-f30b-4ea7-8849-c7264d61fd52 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:19.485020648 +0000 UTC m=+40.083615112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs") pod "network-metrics-daemon-p6zls" (UID: "83957f97-f30b-4ea7-8849-c7264d61fd52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:18 crc kubenswrapper[4731]: I1203 18:55:18.995489 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c
21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:18Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.011002 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nrt7\" (UniqueName: \"kubernetes.io/projected/83957f97-f30b-4ea7-8849-c7264d61fd52-kube-api-access-8nrt7\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.015821 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.034475 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.047686 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.047716 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.047725 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.047738 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.047747 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.048693 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.059369 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.072588 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.083665 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.094797 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.150485 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.150525 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.150538 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.150556 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.150566 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.165870 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/0.log" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.170014 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.170356 4731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.174078 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" event={"ID":"33e88da1-3764-46ce-a91f-fe154f3a9dfe","Type":"ContainerStarted","Data":"5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.174123 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" event={"ID":"33e88da1-3764-46ce-a91f-fe154f3a9dfe","Type":"ContainerStarted","Data":"d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.189337 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.214274 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.225647 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.238823 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.252159 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.253495 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.253531 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.253544 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc 
kubenswrapper[4731]: I1203 18:55:19.253563 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.253577 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.269889 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.289162 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1b
e9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.311330 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\"
,\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f998
7a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.329484 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.340181 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.352114 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.355596 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.355627 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.355637 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.355679 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.355691 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.363499 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.374054 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.392752 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 18:55:16.468886 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 18:55:16.468915 6052 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 18:55:16.469017 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI1203 18:55:16.469079 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 18:55:16.469116 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 18:55:16.469128 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 18:55:16.469133 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 18:55:16.469164 6052 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 18:55:16.469168 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 18:55:16.469196 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 18:55:16.469166 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 18:55:16.469271 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 18:55:16.469290 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 18:55:16.469303 6052 factory.go:656] Stopping watch factory\\\\nI1203 18:55:16.469378 6052 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.402963 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc 
kubenswrapper[4731]: I1203 18:55:19.424195 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.437969 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.456397 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.458042 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.458075 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.458085 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.458100 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.458109 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.474366 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.485945 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"na
me\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.490237 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:19 crc kubenswrapper[4731]: E1203 18:55:19.490471 4731 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:19 crc kubenswrapper[4731]: E1203 18:55:19.490581 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs podName:83957f97-f30b-4ea7-8849-c7264d61fd52 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:20.490559426 +0000 UTC m=+41.089153890 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs") pod "network-metrics-daemon-p6zls" (UID: "83957f97-f30b-4ea7-8849-c7264d61fd52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.496947 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.506699 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.524810 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 18:55:16.468886 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 18:55:16.468915 6052 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 18:55:16.469017 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI1203 18:55:16.469079 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 18:55:16.469116 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 18:55:16.469128 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 18:55:16.469133 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 18:55:16.469164 6052 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 18:55:16.469168 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 18:55:16.469196 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 18:55:16.469166 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 18:55:16.469271 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 18:55:16.469290 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 18:55:16.469303 6052 factory.go:656] Stopping watch factory\\\\nI1203 18:55:16.469378 6052 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.537628 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc 
kubenswrapper[4731]: I1203 18:55:19.550221 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.560606 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.560676 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.560686 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.560702 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.560715 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.567380 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.581479 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.593675 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.617500 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.632112 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.643832 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.658094 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.663389 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc 
kubenswrapper[4731]: I1203 18:55:19.663424 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.663436 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.663456 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.663467 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.672862 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.687289 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.766719 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.766771 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.766784 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.766806 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.766819 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.868336 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4
188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.868646 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.868846 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.868860 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.868876 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.868888 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.882824 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.895716 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.905654 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.919811 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.933504 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.944879 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.961656 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.971099 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:19 crc 
kubenswrapper[4731]: I1203 18:55:19.971142 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.971154 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.971172 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.971183 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:19Z","lastTransitionTime":"2025-12-03T18:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.976304 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.986928 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:19 crc kubenswrapper[4731]: I1203 18:55:19.996321 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc 
kubenswrapper[4731]: I1203 18:55:20.013179 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.023856 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.034842 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.045178 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.058274 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.074146 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.074176 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.074186 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.074200 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.074211 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.076838 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 18:55:16.468886 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 18:55:16.468915 6052 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 18:55:16.469017 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI1203 18:55:16.469079 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 18:55:16.469116 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 18:55:16.469128 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 18:55:16.469133 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 18:55:16.469164 6052 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 18:55:16.469168 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 18:55:16.469196 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 18:55:16.469166 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 18:55:16.469271 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 18:55:16.469290 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 18:55:16.469303 6052 factory.go:656] Stopping watch factory\\\\nI1203 18:55:16.469378 6052 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.176516 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.176556 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.176569 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.176587 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.176599 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.179067 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/1.log" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.179538 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/0.log" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.182188 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff" exitCode=1 Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.182248 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.182326 4731 scope.go:117] "RemoveContainer" containerID="a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.183134 4731 scope.go:117] "RemoveContainer" containerID="130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff" Dec 03 18:55:20 crc kubenswrapper[4731]: E1203 18:55:20.183396 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 
10s restarting failed container=ovnkube-controller pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.201599 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"c
ri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.213872 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.225399 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.236987 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.245764 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.261718 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a527c012a56589f74838783b624e3b03a83bcaf45b20c95d76a615fcbd0dea60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"message\\\":\\\"crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 18:55:16.468886 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 18:55:16.468915 6052 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 18:55:16.469017 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI1203 18:55:16.469079 6052 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 18:55:16.469116 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 18:55:16.469128 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 18:55:16.469133 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 18:55:16.469164 6052 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 18:55:16.469168 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 18:55:16.469196 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 18:55:16.469166 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 18:55:16.469271 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 18:55:16.469290 6052 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 18:55:16.469303 6052 factory.go:656] Stopping watch factory\\\\nI1203 18:55:16.469378 6052 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"message\\\":\\\"achine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID: UUIDName:}]\\\\nF1203 18:55:19.421930 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 18:55:19.4219\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\
\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.272950 4731 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc 
kubenswrapper[4731]: I1203 18:55:20.279533 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.279581 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.279594 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.279613 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.279626 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.283802 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4
188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.294814 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.307844 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.317911 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.333121 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.345364 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.357660 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.370589 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.382110 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc 
kubenswrapper[4731]: I1203 18:55:20.382304 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.382425 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.382528 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.382614 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.385128 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.398075 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T18:55:20Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.485718 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.485755 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.485765 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.485782 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.485794 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.502364 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:20 crc kubenswrapper[4731]: E1203 18:55:20.502561 4731 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:20 crc kubenswrapper[4731]: E1203 18:55:20.502801 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs podName:83957f97-f30b-4ea7-8849-c7264d61fd52 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:22.50278489 +0000 UTC m=+43.101379344 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs") pod "network-metrics-daemon-p6zls" (UID: "83957f97-f30b-4ea7-8849-c7264d61fd52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.588715 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.588756 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.588765 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.588779 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.588788 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.692203 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.692282 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.692297 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.692321 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.692334 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.795268 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.795310 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.795320 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.795336 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.795348 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.855115 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.855214 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.855214 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.855179 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:20 crc kubenswrapper[4731]: E1203 18:55:20.855358 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:20 crc kubenswrapper[4731]: E1203 18:55:20.855483 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:20 crc kubenswrapper[4731]: E1203 18:55:20.855576 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:20 crc kubenswrapper[4731]: E1203 18:55:20.855690 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.904127 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.904179 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.904191 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.904220 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:20 crc kubenswrapper[4731]: I1203 18:55:20.904241 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:20Z","lastTransitionTime":"2025-12-03T18:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.007682 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.007715 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.007724 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.007736 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.007745 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.109994 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.110029 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.110041 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.110053 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.110063 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.187150 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/1.log" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.213125 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.213208 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.213232 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.213303 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.213331 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.316559 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.316639 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.316654 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.316674 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.316687 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.419373 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.419442 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.419455 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.419472 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.419482 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.522297 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.522337 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.522348 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.522366 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.522382 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.624567 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.624624 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.624635 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.624654 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.624667 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.717754 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.718678 4731 scope.go:117] "RemoveContainer" containerID="130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff" Dec 03 18:55:21 crc kubenswrapper[4731]: E1203 18:55:21.718833 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.727285 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.727324 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.727336 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.727352 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.727363 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.748718 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.763372 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.777106 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.780381 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.780420 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.780433 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.780449 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.780459 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.791931 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: E1203 18:55:21.798533 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.803716 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.803752 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.803762 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.803778 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.803789 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.803916 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: E1203 18:55:21.814834 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.818840 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.818906 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.818924 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.818953 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.818976 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.830702 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"message\\\":\\\"achine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 18:55:19.421930 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 18:55:19.4219\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac36
73aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: E1203 18:55:21.834668 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.838317 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.838358 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.838375 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.838396 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.838414 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.843451 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc 
kubenswrapper[4731]: E1203 18:55:21.853634 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.856187 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5c
a2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.857108 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.857135 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.857144 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc 
kubenswrapper[4731]: I1203 18:55:21.857161 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.857172 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: E1203 18:55:21.877540 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: E1203 18:55:21.877713 4731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.879310 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.879342 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.879354 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.879372 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.879386 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.879793 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.895926 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.908859 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.925962 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.940759 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.954328 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.968133 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.981351 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250
f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:
55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.981902 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.981941 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.981953 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.981971 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.981985 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:21Z","lastTransitionTime":"2025-12-03T18:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:21 crc kubenswrapper[4731]: I1203 18:55:21.992039 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:21Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.085202 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc 
kubenswrapper[4731]: I1203 18:55:22.085284 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.085297 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.085317 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.085332 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.188200 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.188244 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.188287 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.188311 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.188324 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.290594 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.290651 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.290662 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.290675 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.290684 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.392898 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.392955 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.392970 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.392990 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.393004 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.495398 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.495436 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.495444 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.495457 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.495467 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.524044 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:22 crc kubenswrapper[4731]: E1203 18:55:22.524273 4731 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:22 crc kubenswrapper[4731]: E1203 18:55:22.524343 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs podName:83957f97-f30b-4ea7-8849-c7264d61fd52 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:26.524327375 +0000 UTC m=+47.122921839 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs") pod "network-metrics-daemon-p6zls" (UID: "83957f97-f30b-4ea7-8849-c7264d61fd52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.598074 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.598139 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.598157 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.598182 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.598200 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.701082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.701129 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.701148 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.701169 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.701181 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.803968 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.804008 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.804019 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.804037 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.804047 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.855888 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.855924 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:22 crc kubenswrapper[4731]: E1203 18:55:22.856176 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.855986 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.855950 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:22 crc kubenswrapper[4731]: E1203 18:55:22.856346 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:22 crc kubenswrapper[4731]: E1203 18:55:22.856477 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:22 crc kubenswrapper[4731]: E1203 18:55:22.856581 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.906885 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.906974 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.906991 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.907013 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:22 crc kubenswrapper[4731]: I1203 18:55:22.907031 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:22Z","lastTransitionTime":"2025-12-03T18:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.009646 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.009691 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.009703 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.009721 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.009734 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.112133 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.112169 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.112178 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.112194 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.112204 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.214165 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.214216 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.214229 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.214281 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.214312 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.317033 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.317082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.317094 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.317114 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.317125 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.420017 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.420074 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.420086 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.420105 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.420117 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.523111 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.523165 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.523178 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.523199 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.523217 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.626459 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.626700 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.626720 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.626747 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.626766 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.729874 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.729960 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.729980 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.730005 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.730022 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.832512 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.832555 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.832565 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.832578 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.832587 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.935995 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.936399 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.936557 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.936698 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:23 crc kubenswrapper[4731]: I1203 18:55:23.936841 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:23Z","lastTransitionTime":"2025-12-03T18:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.040798 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.041342 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.041508 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.041666 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.041819 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.144737 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.144788 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.144797 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.144814 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.144824 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.247373 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.247446 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.247468 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.247501 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.247521 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.350672 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.350727 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.350742 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.350764 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.350778 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.454587 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.454631 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.454642 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.454661 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.454674 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.557558 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.557599 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.557608 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.557628 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.557639 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.660304 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.660382 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.660392 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.660411 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.660425 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.763132 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.763202 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.763221 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.763250 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.763304 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.855440 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.855455 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.855484 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:24 crc kubenswrapper[4731]: E1203 18:55:24.856355 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.855494 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:24 crc kubenswrapper[4731]: E1203 18:55:24.855991 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:24 crc kubenswrapper[4731]: E1203 18:55:24.856508 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:24 crc kubenswrapper[4731]: E1203 18:55:24.856688 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.866389 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.866448 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.866467 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.866495 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.866519 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.969920 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.969967 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.969977 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.969994 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:24 crc kubenswrapper[4731]: I1203 18:55:24.970004 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:24Z","lastTransitionTime":"2025-12-03T18:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.073780 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.073840 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.073861 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.073890 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.073910 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.176480 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.176629 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.176644 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.176663 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.176675 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.279370 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.279422 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.279439 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.279466 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.279482 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.382015 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.382053 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.382065 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.382082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.382094 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.484227 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.484352 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.484372 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.484395 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.484417 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.587132 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.587170 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.587181 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.587197 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.587208 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.690696 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.690757 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.690773 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.690802 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.690827 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.794086 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.794182 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.794203 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.794231 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.794459 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.897455 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.897508 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.897521 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.897543 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:25 crc kubenswrapper[4731]: I1203 18:55:25.897557 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:25Z","lastTransitionTime":"2025-12-03T18:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.000047 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.000100 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.000115 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.000140 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.000156 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.103385 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.103456 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.103474 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.103500 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.103521 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.206489 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.206569 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.206592 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.206620 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.206645 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.310831 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.310895 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.310916 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.310941 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.310961 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.414141 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.414204 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.414219 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.414241 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.414284 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.518079 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.518146 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.518164 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.518192 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.518207 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.570970 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:26 crc kubenswrapper[4731]: E1203 18:55:26.571292 4731 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:26 crc kubenswrapper[4731]: E1203 18:55:26.571431 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs podName:83957f97-f30b-4ea7-8849-c7264d61fd52 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:34.571395979 +0000 UTC m=+55.169990483 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs") pod "network-metrics-daemon-p6zls" (UID: "83957f97-f30b-4ea7-8849-c7264d61fd52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.622114 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.622185 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.622203 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.622232 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.622288 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.725696 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.725742 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.725751 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.725768 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.725778 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.828738 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.828801 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.828812 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.828833 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.828847 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.855440 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.855481 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.855536 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:26 crc kubenswrapper[4731]: E1203 18:55:26.855606 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.855490 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:26 crc kubenswrapper[4731]: E1203 18:55:26.855802 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:26 crc kubenswrapper[4731]: E1203 18:55:26.855879 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:26 crc kubenswrapper[4731]: E1203 18:55:26.855972 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.931692 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.931733 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.931746 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.931762 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:26 crc kubenswrapper[4731]: I1203 18:55:26.931774 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:26Z","lastTransitionTime":"2025-12-03T18:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.034896 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.034966 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.034981 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.035005 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.035020 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.138236 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.138314 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.138326 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.138344 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.138354 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.240727 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.240783 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.240795 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.240815 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.240829 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.344015 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.344102 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.344120 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.344149 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.344176 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.447869 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.447931 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.447946 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.447971 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.447982 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.551369 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.551433 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.551444 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.551466 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.551478 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.654122 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.654200 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.654223 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.654294 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.654315 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.757341 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.757389 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.757401 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.757417 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.757430 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.860846 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.861297 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.861329 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.861355 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.861374 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.964949 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.965326 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.965457 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.965578 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:27 crc kubenswrapper[4731]: I1203 18:55:27.965661 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:27Z","lastTransitionTime":"2025-12-03T18:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.068151 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.068203 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.068216 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.068239 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.068280 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.171271 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.171326 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.171335 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.171353 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.171363 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.274122 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.274166 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.274182 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.274200 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.274212 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.338180 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.351349 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.354085 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.367438 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.381030 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.381077 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.381089 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.381119 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.381137 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.387117 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.398548 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.417151 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.429175 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.440083 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.459930 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.479462 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250
f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:
55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.484034 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.484071 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.484080 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.484096 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.484107 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.499352 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.522028 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"message\\\":\\\"achine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 18:55:19.421930 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 18:55:19.4219\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac36
73aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.536515 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.571030 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.587591 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.587651 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.587698 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.587724 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.587739 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.595340 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.611332 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.630624 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.646129 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:28Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.691365 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.691442 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.691471 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.691509 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.691536 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.795165 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.795208 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.795220 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.795241 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.795273 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.855346 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.855407 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.855419 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:28 crc kubenswrapper[4731]: E1203 18:55:28.855469 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.855645 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:28 crc kubenswrapper[4731]: E1203 18:55:28.855833 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:28 crc kubenswrapper[4731]: E1203 18:55:28.856005 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:28 crc kubenswrapper[4731]: E1203 18:55:28.856191 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.897405 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.897452 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.897467 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.897487 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:28 crc kubenswrapper[4731]: I1203 18:55:28.897500 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:28Z","lastTransitionTime":"2025-12-03T18:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.000746 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.000795 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.000805 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.000826 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.000839 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.104835 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.104916 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.104936 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.104966 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.104989 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.208604 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.208771 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.208793 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.208822 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.208840 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.312673 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.312744 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.312772 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.312822 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.312847 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.416449 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.416520 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.416531 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.416553 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.416566 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.520015 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.520086 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.520105 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.520138 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.520159 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.623495 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.623563 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.623576 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.623601 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.623616 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.726836 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.726902 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.726915 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.726939 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.726954 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.829649 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.829719 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.829738 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.829769 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.829789 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.882704 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:29Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.902875 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:29Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.921025 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:29Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.938452 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.938523 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.938538 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:29 crc 
kubenswrapper[4731]: I1203 18:55:29.938557 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.938571 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:29Z","lastTransitionTime":"2025-12-03T18:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.944892 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:29Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.966232 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1b
e9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:29Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:29 crc kubenswrapper[4731]: I1203 18:55:29.985844 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:29Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.003729 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b2
1428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.026304 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.041424 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.041482 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.041501 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.041530 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.041550 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.050419 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.065308 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.082881 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.098777 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.122377 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"message\\\":\\\"achine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 18:55:19.421930 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 18:55:19.4219\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac36
73aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.139067 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.144239 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.144330 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.144354 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.144388 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.144413 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.158082 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4
188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.176055 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.196294 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.213047 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:30Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.248226 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.248339 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.248362 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.248391 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.248412 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.351592 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.351658 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.351678 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.351708 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.351727 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.455637 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.455689 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.455700 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.455720 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.455733 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.558839 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.558901 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.558922 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.558948 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.558968 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.661580 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.661645 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.661663 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.661685 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.661704 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.717628 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.717896 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 18:56:02.717849348 +0000 UTC m=+83.316443862 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.717995 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.718050 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.718127 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.718167 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718297 4731 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718333 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718369 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718403 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718422 4731 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718375 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718492 4731 projected.go:194] 
Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718376 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:56:02.718356964 +0000 UTC m=+83.316951428 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718339 4731 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718544 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 18:56:02.718533279 +0000 UTC m=+83.317127933 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718731 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 18:56:02.718694724 +0000 UTC m=+83.317289268 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.718771 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:56:02.718757186 +0000 UTC m=+83.317351680 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.764383 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.764463 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.764487 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.764523 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.764547 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.855876 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.855914 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.855913 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.855876 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.856061 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.856207 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.856423 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:30 crc kubenswrapper[4731]: E1203 18:55:30.856646 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.867385 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.867425 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.867437 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.867456 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.867469 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.970402 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.970517 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.970538 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.970569 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:30 crc kubenswrapper[4731]: I1203 18:55:30.970587 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:30Z","lastTransitionTime":"2025-12-03T18:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.073722 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.073774 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.073783 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.073801 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.073810 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.176057 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.176116 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.176126 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.176144 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.176155 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.280010 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.280080 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.280094 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.280115 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.280129 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.383414 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.383502 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.383534 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.383569 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.383589 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.486125 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.486195 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.486214 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.486241 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.486319 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.589751 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.589807 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.589816 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.589836 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.589848 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.692391 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.692427 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.692436 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.692449 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.692458 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.795727 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.795776 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.795788 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.795807 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.795821 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.898642 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.898689 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.898699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.898716 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:31 crc kubenswrapper[4731]: I1203 18:55:31.898725 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:31Z","lastTransitionTime":"2025-12-03T18:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.002155 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.002227 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.002249 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.002331 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.002356 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.105124 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.105193 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.105208 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.105224 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.105234 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.178371 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.178426 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.178439 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.178458 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.178470 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.192419 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:32Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.196431 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.196487 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.196506 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.196532 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.196551 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.211958 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:32Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.217189 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.217232 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.217241 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.217274 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.217284 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.236096 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.236135 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.236147 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.236162 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.236171 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.250578 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:32Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.253946 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.253987 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.253999 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.254015 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.254026 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.269947 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:32Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.270134 4731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.271643 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.271696 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.271707 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.271722 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.271734 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.374520 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.374560 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.374569 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.374587 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.374597 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.477198 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.477240 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.477263 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.477280 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.477295 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.580968 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.581009 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.581019 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.581038 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.581048 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.684793 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.684856 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.684873 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.684898 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.684915 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.788425 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.788477 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.788489 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.788508 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.788522 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.855965 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.856049 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.855994 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.856165 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.856112 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.856229 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.856331 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:32 crc kubenswrapper[4731]: E1203 18:55:32.856469 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.891622 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.891662 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.891676 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.891693 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.891705 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.994620 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.994678 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.994695 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.994716 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:32 crc kubenswrapper[4731]: I1203 18:55:32.994733 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:32Z","lastTransitionTime":"2025-12-03T18:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.097179 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.097225 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.097236 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.097281 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.097305 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.200536 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.200584 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.200596 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.200614 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.200627 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.303822 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.303895 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.303914 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.303945 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.303968 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.407796 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.407881 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.407907 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.407941 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.407966 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.511129 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.511203 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.511221 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.511246 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.511297 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.614168 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.614233 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.614250 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.614329 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.614356 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.717964 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.718007 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.718016 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.718034 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.718043 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.821446 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.821503 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.821517 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.821540 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.821555 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.924088 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.924133 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.924143 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.924158 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:33 crc kubenswrapper[4731]: I1203 18:55:33.924167 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:33Z","lastTransitionTime":"2025-12-03T18:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.026635 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.026674 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.026687 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.026706 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.026719 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.129461 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.129515 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.129527 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.129547 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.129561 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.231740 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.231829 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.231843 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.231865 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.231878 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.334114 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.334159 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.334171 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.334187 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.334196 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.437103 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.437174 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.437193 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.437221 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.437247 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.540154 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.540207 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.540220 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.540240 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.540278 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.643321 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.643389 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.643412 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.643436 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.643453 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.666493 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:34 crc kubenswrapper[4731]: E1203 18:55:34.666721 4731 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:34 crc kubenswrapper[4731]: E1203 18:55:34.666816 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs podName:83957f97-f30b-4ea7-8849-c7264d61fd52 nodeName:}" failed. No retries permitted until 2025-12-03 18:55:50.666797577 +0000 UTC m=+71.265392041 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs") pod "network-metrics-daemon-p6zls" (UID: "83957f97-f30b-4ea7-8849-c7264d61fd52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.746526 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.746567 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.746577 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.746591 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.746600 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.849938 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.849981 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.849993 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.850011 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.850024 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.855811 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.855862 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:34 crc kubenswrapper[4731]: E1203 18:55:34.855916 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.855959 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.856115 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:34 crc kubenswrapper[4731]: E1203 18:55:34.856118 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:34 crc kubenswrapper[4731]: E1203 18:55:34.856165 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:34 crc kubenswrapper[4731]: E1203 18:55:34.856223 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.952690 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.952774 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.952787 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.952804 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:34 crc kubenswrapper[4731]: I1203 18:55:34.952818 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:34Z","lastTransitionTime":"2025-12-03T18:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.055420 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.055495 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.055513 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.055533 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.055549 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.158378 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.158474 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.158494 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.158520 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.158535 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.260550 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.260601 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.260628 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.260644 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.260652 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.364326 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.364399 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.364417 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.364443 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.364462 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.474951 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.475060 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.475081 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.475144 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.475165 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.579009 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.579101 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.579129 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.579165 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.579185 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.682525 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.682621 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.682659 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.682690 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.682713 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.787210 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.787295 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.787312 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.787336 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.787354 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.857985 4731 scope.go:117] "RemoveContainer" containerID="130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.890231 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.890553 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.890565 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.890582 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.890594 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.993165 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.993211 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.993226 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.993249 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:35 crc kubenswrapper[4731]: I1203 18:55:35.993293 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:35Z","lastTransitionTime":"2025-12-03T18:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.096695 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.096746 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.096764 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.096790 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.096812 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.200454 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.200511 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.200524 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.200549 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.200566 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.246546 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/1.log" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.249927 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.250511 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.274300 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.292613 4731 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.303341 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.303390 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.303404 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.303428 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.303440 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.307543 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.321323 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.336937 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.351817 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.369312 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.393891 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.405405 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc 
kubenswrapper[4731]: I1203 18:55:36.405449 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.405462 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.405478 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.405489 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.413244 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.430648 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c
0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.442497 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b2
1428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.452853 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc 
kubenswrapper[4731]: I1203 18:55:36.470777 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.484737 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.496929 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.507658 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.507709 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.507720 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.507736 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.507749 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.509568 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.519962 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.539078 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"message\\\":\\\"achine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 18:55:19.421930 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 
18:55:19.4219\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",
\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:36Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.609863 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.609909 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.609918 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.609933 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.609943 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.712403 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.712446 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.712457 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.712472 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.712482 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.815159 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.815194 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.815204 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.815217 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.815227 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.855165 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:36 crc kubenswrapper[4731]: E1203 18:55:36.855322 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.855369 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.855399 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.855451 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:36 crc kubenswrapper[4731]: E1203 18:55:36.855501 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:36 crc kubenswrapper[4731]: E1203 18:55:36.855715 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:36 crc kubenswrapper[4731]: E1203 18:55:36.855809 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.917246 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.917314 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.917326 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.917341 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:36 crc kubenswrapper[4731]: I1203 18:55:36.917351 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:36Z","lastTransitionTime":"2025-12-03T18:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.020827 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.020872 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.020881 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.020902 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.020915 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.123818 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.123872 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.123920 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.123946 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.123965 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.227143 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.227207 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.227249 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.227315 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.227334 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.329979 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.330038 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.330051 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.330064 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.330073 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.433217 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.433331 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.433354 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.433387 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.433412 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.536400 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.536459 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.536476 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.536503 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.536521 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.639693 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.639966 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.639977 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.639996 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.640005 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.743340 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.743427 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.743447 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.743471 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.743485 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.845914 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.845972 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.845989 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.846013 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.846031 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.949437 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.949504 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.949523 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.949547 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:37 crc kubenswrapper[4731]: I1203 18:55:37.949565 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:37Z","lastTransitionTime":"2025-12-03T18:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.054444 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.054521 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.054543 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.054567 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.054585 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.157918 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.157955 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.157966 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.157985 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.157998 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.260229 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.260293 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.260305 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.260321 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.260332 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.307975 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/2.log" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.308910 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/1.log" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.311676 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" exitCode=1 Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.311724 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.311771 4731 scope.go:117] "RemoveContainer" containerID="130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.312802 4731 scope.go:117] "RemoveContainer" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 18:55:38 crc kubenswrapper[4731]: E1203 18:55:38.313048 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.328190 4731 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.347291 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.359579 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.362874 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.362923 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.362944 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.362975 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.362995 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.371655 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.390945 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\"
,\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f998
7a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.403986 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.414701 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.427790 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.447520 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250
f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:
55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.463913 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13b
f1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.466599 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.466655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.466667 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.466683 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 
03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.466696 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.478124 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.497385 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2
fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.513225 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.526110 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.537048 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.545286 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.562704 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"message\\\":\\\"achine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 18:55:19.421930 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 18:55:19.4219\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:37Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1203 18:55:36.764912 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-x7zbk after 0 failed attempt(s)\\\\nI1203 18:55:36.764946 6441 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-x7zbk\\\\nI1203 18:55:36.764952 6441 base_network_controller_pods.go:477] 
[default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1203 18:55:36.764965 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764977 6441 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764985 6441 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-mmjcd in node crc\\\\nI1203 18:55:36.764992 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd after 0 failed attempt(s)\\\\nI1203 18:55:36.764995 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 18:55:36.764998 6441\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.569838 4731 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.569881 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.569894 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.569914 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.569929 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.572844 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:38Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:38 crc 
kubenswrapper[4731]: I1203 18:55:38.672152 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.672230 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.672285 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.672318 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.672345 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.774484 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.774522 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.774531 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.774545 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.774554 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.855214 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.855298 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:38 crc kubenswrapper[4731]: E1203 18:55:38.855427 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.855501 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.855529 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:38 crc kubenswrapper[4731]: E1203 18:55:38.855781 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:38 crc kubenswrapper[4731]: E1203 18:55:38.855723 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:38 crc kubenswrapper[4731]: E1203 18:55:38.855915 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.877619 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.877669 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.877686 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.877710 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.877728 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.980449 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.980489 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.980498 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.980512 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:38 crc kubenswrapper[4731]: I1203 18:55:38.980520 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:38Z","lastTransitionTime":"2025-12-03T18:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.082960 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.083004 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.083014 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.083029 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.083038 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.186630 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.186676 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.186688 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.186706 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.186717 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.289550 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.289592 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.289603 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.289617 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.289627 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.330670 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/2.log" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.393432 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.393479 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.393489 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.393508 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.393518 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.496948 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.497059 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.497085 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.497174 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.497208 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.602960 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.603054 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.603066 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.603101 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.603112 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.706314 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.706396 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.706409 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.706432 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.706446 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.808645 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.808694 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.808703 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.808721 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.808730 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.872143 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:39Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.891612 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:39Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.905892 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:39Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.911341 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.911395 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.911412 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:39 crc 
kubenswrapper[4731]: I1203 18:55:39.911435 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.911451 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:39Z","lastTransitionTime":"2025-12-03T18:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.926617 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:39Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.952035 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1b
e9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:39Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.971834 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:39Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:39 crc kubenswrapper[4731]: I1203 18:55:39.985804 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b2
1428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:39Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.006111 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.014594 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.014629 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.014642 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.014658 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.014670 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.019987 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.033713 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.045585 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.056582 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.074897 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"message\\\":\\\"achine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 18:55:19.421930 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 18:55:19.4219\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:37Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1203 18:55:36.764912 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-x7zbk after 0 failed attempt(s)\\\\nI1203 18:55:36.764946 6441 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-x7zbk\\\\nI1203 18:55:36.764952 6441 base_network_controller_pods.go:477] 
[default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1203 18:55:36.764965 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764977 6441 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764985 6441 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-mmjcd in node crc\\\\nI1203 18:55:36.764992 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd after 0 failed attempt(s)\\\\nI1203 18:55:36.764995 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 18:55:36.764998 6441\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.085452 4731 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc 
kubenswrapper[4731]: I1203 18:55:40.101367 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.117298 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.117335 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.117346 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.117362 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.117373 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.118356 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.131410 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.143316 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:40Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.219886 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.219973 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.219985 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.220025 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.220047 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.322852 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.323442 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.323461 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.323493 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.323512 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.426611 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.426658 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.426670 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.426688 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.426699 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.529239 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.529314 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.529326 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.529344 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.529355 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.631562 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.631616 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.631632 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.631654 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.631670 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.733635 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.733681 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.733693 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.733710 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.733723 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.836965 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.837029 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.837042 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.837059 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.837072 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.855760 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.855806 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.855968 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.855878 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:40 crc kubenswrapper[4731]: E1203 18:55:40.856220 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:40 crc kubenswrapper[4731]: E1203 18:55:40.856349 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:40 crc kubenswrapper[4731]: E1203 18:55:40.856393 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:40 crc kubenswrapper[4731]: E1203 18:55:40.856493 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.939891 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.939933 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.939945 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.939961 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:40 crc kubenswrapper[4731]: I1203 18:55:40.939971 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:40Z","lastTransitionTime":"2025-12-03T18:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.042970 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.043049 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.043067 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.043090 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.043102 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.145065 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.145099 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.145108 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.145124 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.145133 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.247200 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.247265 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.247278 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.247294 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.247304 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.350022 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.350067 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.350077 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.350096 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.350108 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.452719 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.452785 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.452803 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.452826 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.452844 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.555086 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.555151 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.555175 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.555204 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.555225 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.657529 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.657608 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.657632 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.657663 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.657690 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.760025 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.760089 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.760099 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.760119 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.760129 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.862864 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.863124 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.863145 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.863170 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.863193 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.965088 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.965169 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.965193 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.965222 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:41 crc kubenswrapper[4731]: I1203 18:55:41.965244 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:41Z","lastTransitionTime":"2025-12-03T18:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.067635 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.067690 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.067701 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.067721 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.067732 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.170717 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.170782 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.170802 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.170829 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.170892 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.274482 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.274529 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.274576 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.274602 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.274619 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.357147 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.357239 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.357296 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.357329 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.357350 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: E1203 18:55:42.373887 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:42Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.379749 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.379828 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.379847 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.379878 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.379919 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.405790 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.405850 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.405871 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.405902 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.405923 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:42Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.427384 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.427461 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.427473 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.427518 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.427533 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: E1203 18:55:42.447336 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:42Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.452763 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.452875 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.452895 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.452922 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.452942 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: E1203 18:55:42.477057 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:42Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:42 crc kubenswrapper[4731]: E1203 18:55:42.477240 4731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.495016 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.495080 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.495096 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.495116 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.495127 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.598008 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.598109 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.598130 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.598162 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.598181 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.702115 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.702196 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.702220 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.702285 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.702310 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.805505 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.805552 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.805564 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.805591 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.805605 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.855436 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.855522 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.855522 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.855529 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:42 crc kubenswrapper[4731]: E1203 18:55:42.855635 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:42 crc kubenswrapper[4731]: E1203 18:55:42.855810 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:42 crc kubenswrapper[4731]: E1203 18:55:42.855964 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:42 crc kubenswrapper[4731]: E1203 18:55:42.856144 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.908566 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.908614 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.908624 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.908645 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:42 crc kubenswrapper[4731]: I1203 18:55:42.908661 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:42Z","lastTransitionTime":"2025-12-03T18:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.011798 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.011835 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.011844 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.011861 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.011871 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.115882 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.115975 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.115993 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.116021 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.116037 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.219002 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.219048 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.219059 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.219083 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.219095 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.321554 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.321605 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.321619 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.321638 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.321650 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.424442 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.424518 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.424535 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.424565 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.424587 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.527907 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.527951 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.527966 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.527989 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.528005 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.631247 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.631328 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.631340 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.631359 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.631370 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.734425 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.734467 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.734476 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.734489 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.734499 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.837409 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.837480 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.837488 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.837505 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.837517 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.940312 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.940354 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.940363 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.940379 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:43 crc kubenswrapper[4731]: I1203 18:55:43.940388 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:43Z","lastTransitionTime":"2025-12-03T18:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.043656 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.043694 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.043704 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.043719 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.043729 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.146021 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.146061 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.146071 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.146091 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.146107 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.255029 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.255076 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.255088 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.255105 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.255120 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.357466 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.357508 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.357518 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.357554 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.357563 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.459464 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.459501 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.459512 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.459527 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.459538 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.561309 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.561376 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.561390 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.561405 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.561415 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.663465 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.663522 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.663533 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.663551 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.663561 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.766871 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.766931 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.766941 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.766957 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.766966 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.855420 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.855466 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.855499 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:44 crc kubenswrapper[4731]: E1203 18:55:44.855560 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:44 crc kubenswrapper[4731]: E1203 18:55:44.855680 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.855739 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:44 crc kubenswrapper[4731]: E1203 18:55:44.855789 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:44 crc kubenswrapper[4731]: E1203 18:55:44.855806 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.869302 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.869343 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.869356 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.869376 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.869418 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.971655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.971699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.971709 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.971726 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:44 crc kubenswrapper[4731]: I1203 18:55:44.971736 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:44Z","lastTransitionTime":"2025-12-03T18:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.080345 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.080385 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.080396 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.080414 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.080425 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.182694 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.182751 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.182765 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.182786 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.182799 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.285647 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.285686 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.285695 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.285709 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.285718 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.388235 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.388294 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.388303 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.388321 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.388330 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.490924 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.491008 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.491031 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.491055 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.491073 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.593073 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.593118 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.593129 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.593146 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.593156 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.695046 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.695084 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.695094 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.695112 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.695123 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.798075 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.798114 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.798125 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.798141 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.798152 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.902015 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.902067 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.902083 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.902362 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:45 crc kubenswrapper[4731]: I1203 18:55:45.902396 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:45Z","lastTransitionTime":"2025-12-03T18:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.006415 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.006466 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.006499 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.006519 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.006534 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.108518 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.108566 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.108581 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.108602 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.108617 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.211337 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.211382 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.211392 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.211409 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.211423 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.313978 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.314016 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.314026 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.314041 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.314051 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.416592 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.416671 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.416695 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.416729 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.416751 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.518684 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.518735 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.518746 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.518759 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.518768 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.620890 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.620949 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.620961 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.620978 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.620990 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.723110 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.723189 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.723204 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.723220 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.723232 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.825867 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.825910 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.825923 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.825939 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.825951 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.855665 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.855744 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.855679 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.855679 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:46 crc kubenswrapper[4731]: E1203 18:55:46.855907 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:46 crc kubenswrapper[4731]: E1203 18:55:46.856073 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:46 crc kubenswrapper[4731]: E1203 18:55:46.856175 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:46 crc kubenswrapper[4731]: E1203 18:55:46.856287 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.928568 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.928602 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.928612 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.928626 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:46 crc kubenswrapper[4731]: I1203 18:55:46.928636 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:46Z","lastTransitionTime":"2025-12-03T18:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.030480 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.030517 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.030526 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.030541 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.030551 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.132915 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.132953 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.132962 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.132978 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.132988 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.234883 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.234927 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.234938 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.234954 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.234966 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.338047 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.338107 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.338122 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.338145 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.338161 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.441344 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.441386 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.441395 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.441410 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.441421 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.543590 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.543640 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.543652 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.543669 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.543681 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.649181 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.649217 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.649226 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.649242 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.649270 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.751853 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.751900 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.751911 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.751928 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.751938 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.855093 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.855158 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.855171 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.855193 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.855212 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.957609 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.957651 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.957663 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.957680 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:47 crc kubenswrapper[4731]: I1203 18:55:47.957692 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:47Z","lastTransitionTime":"2025-12-03T18:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.060052 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.060104 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.060119 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.060140 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.060155 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.163461 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.163501 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.163510 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.163523 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.163532 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.265699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.265730 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.265740 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.265753 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.265764 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.368299 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.368366 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.368375 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.368411 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.368423 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.470882 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.470960 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.470975 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.470995 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.471018 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.573665 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.573712 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.573721 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.573737 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.573747 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.675982 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.676020 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.676032 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.676046 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.676057 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.778410 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.778460 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.778470 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.778488 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.778501 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.856080 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.856127 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.856154 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.856105 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:48 crc kubenswrapper[4731]: E1203 18:55:48.856235 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:48 crc kubenswrapper[4731]: E1203 18:55:48.856322 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:48 crc kubenswrapper[4731]: E1203 18:55:48.856492 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:48 crc kubenswrapper[4731]: E1203 18:55:48.856757 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.880563 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.880596 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.880604 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.880618 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.880627 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.982581 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.982620 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.982636 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.982657 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:48 crc kubenswrapper[4731]: I1203 18:55:48.982670 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:48Z","lastTransitionTime":"2025-12-03T18:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.084648 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.084698 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.084709 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.084727 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.084739 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.187059 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.187177 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.187196 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.187221 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.187238 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.290718 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.290761 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.290770 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.290786 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.290795 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.394611 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.394699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.394719 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.394742 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.394789 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.498707 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.498754 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.498764 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.498780 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.498791 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.602741 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.602825 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.602850 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.602883 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.602906 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.706438 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.706481 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.706491 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.706507 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.706517 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.809685 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.809739 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.809750 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.809766 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.809776 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.870421 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.888222 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.899672 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.911911 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.911916 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.911962 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:49 crc 
kubenswrapper[4731]: I1203 18:55:49.912082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.912216 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.912246 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:49Z","lastTransitionTime":"2025-12-03T18:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.925702 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.940828 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c
0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.952496 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b2
1428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.975189 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:49 crc kubenswrapper[4731]: I1203 18:55:49.989155 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.002326 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T18:55:49Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.013565 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.014364 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.014406 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.014439 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.014457 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.014469 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.023319 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.039455 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://130ea596c62034c8b9f37f6cd2de2a45a77524e25d6fcc3526b03d0f477bbcff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"message\\\":\\\"achine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 18:55:19.421930 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 18:55:19.4219\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:37Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1203 18:55:36.764912 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-x7zbk after 0 failed attempt(s)\\\\nI1203 18:55:36.764946 6441 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-x7zbk\\\\nI1203 18:55:36.764952 6441 base_network_controller_pods.go:477] 
[default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1203 18:55:36.764965 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764977 6441 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764985 6441 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-mmjcd in node crc\\\\nI1203 18:55:36.764992 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd after 0 failed attempt(s)\\\\nI1203 18:55:36.764995 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 18:55:36.764998 6441\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.049092 4731 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc 
kubenswrapper[4731]: I1203 18:55:50.059170 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.068877 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.079738 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.089047 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.116940 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.116981 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.116993 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.117009 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.117021 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.218758 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.218803 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.218817 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.218837 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.218850 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.321710 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.321752 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.321761 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.321776 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.321790 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.423833 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.423872 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.423881 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.423895 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.423905 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.526210 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.526283 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.526295 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.526312 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.526323 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.628980 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.629030 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.629041 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.629061 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.629075 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.732129 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.732171 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.732180 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.732195 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.732205 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.742081 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:50 crc kubenswrapper[4731]: E1203 18:55:50.742241 4731 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:50 crc kubenswrapper[4731]: E1203 18:55:50.742670 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs podName:83957f97-f30b-4ea7-8849-c7264d61fd52 nodeName:}" failed. No retries permitted until 2025-12-03 18:56:22.742643375 +0000 UTC m=+103.341237879 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs") pod "network-metrics-daemon-p6zls" (UID: "83957f97-f30b-4ea7-8849-c7264d61fd52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.834792 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.835091 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.835195 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.835315 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.835444 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.855475 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.855576 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.855596 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:50 crc kubenswrapper[4731]: E1203 18:55:50.855723 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.855507 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:50 crc kubenswrapper[4731]: E1203 18:55:50.856149 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:50 crc kubenswrapper[4731]: E1203 18:55:50.856227 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:50 crc kubenswrapper[4731]: E1203 18:55:50.856539 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.857659 4731 scope.go:117] "RemoveContainer" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 18:55:50 crc kubenswrapper[4731]: E1203 18:55:50.857963 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.870626 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.881987 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.895002 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.906828 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.925850 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.937579 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:50 crc 
kubenswrapper[4731]: I1203 18:55:50.937622 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.937631 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.937647 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.937657 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:50Z","lastTransitionTime":"2025-12-03T18:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.941395 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.953668 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.965429 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.977478 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.989050 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:50 crc kubenswrapper[4731]: I1203 18:55:50.999101 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b2
1428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:50Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.011481 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:51Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.020512 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:51Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.037649 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:37Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1203 18:55:36.764912 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-x7zbk after 0 failed attempt(s)\\\\nI1203 18:55:36.764946 6441 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-x7zbk\\\\nI1203 18:55:36.764952 6441 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1203 18:55:36.764965 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764977 6441 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764985 6441 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-mmjcd in node crc\\\\nI1203 18:55:36.764992 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd after 0 failed attempt(s)\\\\nI1203 18:55:36.764995 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 18:55:36.764998 6441\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac36
73aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:51Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.040558 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.040600 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.040614 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.040630 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.040640 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.048436 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:51Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:51 crc 
kubenswrapper[4731]: I1203 18:55:51.066295 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:51Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.079753 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:51Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.091811 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:51Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.142521 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.142713 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.142827 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.142916 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.142988 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.245554 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.245784 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.245907 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.246001 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.246077 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.348455 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.348750 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.348842 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.348916 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.348989 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.451275 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.451317 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.451328 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.451342 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.451353 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.553592 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.553655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.553666 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.553684 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.553696 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.655903 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.655930 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.655938 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.655951 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.655961 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.758445 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.758482 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.758490 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.758505 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.758514 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.860019 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.860059 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.860071 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.860084 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.860095 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.962429 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.962718 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.962779 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.962846 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:51 crc kubenswrapper[4731]: I1203 18:55:51.962907 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:51Z","lastTransitionTime":"2025-12-03T18:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.065563 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.065844 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.065926 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.066007 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.066065 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.168118 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.168183 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.168197 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.168212 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.168223 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.271093 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.271408 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.271496 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.271615 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.271745 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.374091 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.374136 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.374145 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.374161 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.374173 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.376433 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7zbk_4ee4f887-8ce3-42c9-9886-06bdf109800c/kube-multus/0.log" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.376477 4731 generic.go:334] "Generic (PLEG): container finished" podID="4ee4f887-8ce3-42c9-9886-06bdf109800c" containerID="200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887" exitCode=1 Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.376501 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7zbk" event={"ID":"4ee4f887-8ce3-42c9-9886-06bdf109800c","Type":"ContainerDied","Data":"200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.376841 4731 scope.go:117] "RemoveContainer" containerID="200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.394021 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.408754 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.420870 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.444348 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.460563 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8887150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47826
34286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.476580 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.476637 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.476648 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.476735 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.476747 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.477901 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.490984 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.508028 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.508245 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.508352 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.508438 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.508500 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.508890 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.521756 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.522863 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"2025-12-03T18:55:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bddabd57-4410-44e2-a2c5-083171082726\\\\n2025-12-03T18:55:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bddabd57-4410-44e2-a2c5-083171082726 to /host/opt/cni/bin/\\\\n2025-12-03T18:55:07Z [verbose] multus-daemon started\\\\n2025-12-03T18:55:07Z [verbose] Readiness Indicator file check\\\\n2025-12-03T18:55:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.525541 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.525568 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.525576 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.525589 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.525598 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.535863 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd
769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.538628 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.542053 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.542213 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.542313 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.542407 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.542500 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.548618 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.554415 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.558439 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.558513 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.558523 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.558536 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.558545 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.560808 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.570516 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.573515 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.573555 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.573566 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.573582 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.573591 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.585761 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:37Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1203 18:55:36.764912 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-x7zbk after 0 failed attempt(s)\\\\nI1203 18:55:36.764946 6441 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-x7zbk\\\\nI1203 18:55:36.764952 6441 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1203 18:55:36.764965 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764977 6441 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764985 6441 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-mmjcd in node crc\\\\nI1203 18:55:36.764992 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd after 0 failed attempt(s)\\\\nI1203 18:55:36.764995 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 18:55:36.764998 6441\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac36
73aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.588844 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.588979 4731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.592751 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.592787 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.592800 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.592816 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.592829 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.598161 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc 
kubenswrapper[4731]: I1203 18:55:52.620276 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.632628 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.648550 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.663890 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:52Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.695658 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.695689 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.695699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.695713 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.695722 4731 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.797372 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.797407 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.797419 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.797434 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.797446 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.855615 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.855803 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.855654 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.855663 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.855811 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.856312 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.856335 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:52 crc kubenswrapper[4731]: E1203 18:55:52.856471 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.900274 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.900311 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.900321 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.900337 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:52 crc kubenswrapper[4731]: I1203 18:55:52.900348 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:52Z","lastTransitionTime":"2025-12-03T18:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.002820 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.002869 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.002881 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.002901 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.002912 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.105644 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.105712 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.105724 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.105759 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.105777 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.208807 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.208870 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.208911 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.208940 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.208963 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.311662 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.311692 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.311704 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.311719 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.311730 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.380761 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7zbk_4ee4f887-8ce3-42c9-9886-06bdf109800c/kube-multus/0.log" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.380823 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7zbk" event={"ID":"4ee4f887-8ce3-42c9-9886-06bdf109800c","Type":"ContainerStarted","Data":"99de44708f66eff904f8b59f32bc817fdce391655d10466f27829884f1d19bf0"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.393824 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.405767 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.414251 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.414314 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.414328 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.414348 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.414362 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.416635 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.427073 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.440883 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.458489 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.471480 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99de44708f66eff904f8b59f32bc817fdce391655d10466f27829884f1d19bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"2025-12-03T18:55:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bddabd57-4410-44e2-a2c5-083171082726\\\\n2025-12-03T18:55:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bddabd57-4410-44e2-a2c5-083171082726 to /host/opt/cni/bin/\\\\n2025-12-03T18:55:07Z [verbose] multus-daemon started\\\\n2025-12-03T18:55:07Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T18:55:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.487767 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://888
7150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.502137 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\"
,\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f998
7a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.512741 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b2
1428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.516303 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.516352 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.516385 4731 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.516405 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.516416 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.523028 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.535699 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.547764 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.562590 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.574795 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.599308 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:37Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1203 18:55:36.764912 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-x7zbk after 0 failed attempt(s)\\\\nI1203 18:55:36.764946 6441 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-x7zbk\\\\nI1203 18:55:36.764952 6441 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1203 18:55:36.764965 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764977 6441 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764985 6441 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-mmjcd in node crc\\\\nI1203 18:55:36.764992 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd after 0 failed attempt(s)\\\\nI1203 18:55:36.764995 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 18:55:36.764998 6441\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac36
73aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.613405 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.618296 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.618324 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.618332 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.618346 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.618355 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.634964 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:53Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.720425 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.720512 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.720527 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.720545 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.720555 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.824389 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.824434 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.824442 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.824458 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.824468 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.926681 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.926724 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.926739 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.926755 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:53 crc kubenswrapper[4731]: I1203 18:55:53.926765 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:53Z","lastTransitionTime":"2025-12-03T18:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.030043 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.030083 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.030093 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.030121 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.030134 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.132103 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.132139 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.132149 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.132167 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.132177 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.235063 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.235107 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.235120 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.235141 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.235152 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.338280 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.338323 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.338336 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.338352 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.338363 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.440980 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.441034 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.441050 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.441070 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.441087 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.543583 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.543632 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.543644 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.543661 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.543685 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.646562 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.646606 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.646617 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.646632 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.646642 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.749561 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.749626 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.749643 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.749672 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.749696 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.852564 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.852606 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.852614 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.852629 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.852641 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.856012 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.856065 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.856038 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.856023 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:54 crc kubenswrapper[4731]: E1203 18:55:54.856146 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:54 crc kubenswrapper[4731]: E1203 18:55:54.856221 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:54 crc kubenswrapper[4731]: E1203 18:55:54.856334 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:54 crc kubenswrapper[4731]: E1203 18:55:54.856434 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.954708 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.954748 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.954756 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.954777 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:54 crc kubenswrapper[4731]: I1203 18:55:54.954790 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:54Z","lastTransitionTime":"2025-12-03T18:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.056864 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.056911 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.056922 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.056944 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.056958 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.158957 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.158999 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.159015 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.159030 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.159041 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.261424 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.261482 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.261494 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.261511 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.261523 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.364191 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.364232 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.364243 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.364284 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.364299 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.467122 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.467160 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.467169 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.467181 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.467190 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.569953 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.570033 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.570048 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.570066 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.570084 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.672715 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.672758 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.672770 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.672785 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.672802 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.776155 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.776202 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.776218 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.776241 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.776314 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.878447 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.878496 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.878508 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.878527 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.878539 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.981459 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.981498 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.981508 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.981524 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:55 crc kubenswrapper[4731]: I1203 18:55:55.981535 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:55Z","lastTransitionTime":"2025-12-03T18:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.083440 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.083491 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.083501 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.083517 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.083526 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.186735 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.186813 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.186837 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.186867 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.186887 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.289368 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.289440 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.289458 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.289496 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.289513 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.391407 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.391477 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.391501 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.391532 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.391552 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.494026 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.494072 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.494088 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.494105 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.494115 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.597520 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.597555 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.597565 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.597582 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.597593 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.699564 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.700198 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.700305 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.700433 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.700550 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.802842 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.802919 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.802938 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.802957 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.803005 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.855113 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.855143 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.855149 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.855123 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:56 crc kubenswrapper[4731]: E1203 18:55:56.855223 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:56 crc kubenswrapper[4731]: E1203 18:55:56.855310 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:56 crc kubenswrapper[4731]: E1203 18:55:56.855381 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:56 crc kubenswrapper[4731]: E1203 18:55:56.855576 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.905354 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.905396 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.905408 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.905425 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:56 crc kubenswrapper[4731]: I1203 18:55:56.905435 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:56Z","lastTransitionTime":"2025-12-03T18:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.008322 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.008382 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.008391 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.008405 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.008415 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.111796 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.111840 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.111852 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.111870 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.111882 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.214767 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.214844 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.214854 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.214868 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.214879 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.317058 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.317099 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.317110 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.317129 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.317144 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.419857 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.419899 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.419911 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.419927 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.419937 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.522213 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.522309 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.522334 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.522358 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.522374 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.625530 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.625592 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.625612 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.625636 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.625653 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.727742 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.727819 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.727842 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.727872 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.727895 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.830121 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.830184 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.830203 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.830231 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.830250 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.933280 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.933337 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.933349 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.933365 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:57 crc kubenswrapper[4731]: I1203 18:55:57.933378 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:57Z","lastTransitionTime":"2025-12-03T18:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.036099 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.036144 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.036156 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.036172 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.036184 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.138604 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.138681 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.138695 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.138713 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.138725 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.244118 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.244175 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.244195 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.244245 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.244303 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.346484 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.346546 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.346560 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.346577 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.346588 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.449598 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.449648 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.449658 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.449675 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.449683 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.552917 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.552972 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.552987 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.553007 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.553021 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.655534 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.655593 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.655603 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.655617 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.655627 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.757661 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.757693 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.757701 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.757713 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.757723 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.855014 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.855045 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:55:58 crc kubenswrapper[4731]: E1203 18:55:58.855185 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.855225 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.855242 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:55:58 crc kubenswrapper[4731]: E1203 18:55:58.855403 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:55:58 crc kubenswrapper[4731]: E1203 18:55:58.855602 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:55:58 crc kubenswrapper[4731]: E1203 18:55:58.855702 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.860444 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.860499 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.860517 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.860541 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.860558 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.963511 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.963579 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.963598 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.963624 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:58 crc kubenswrapper[4731]: I1203 18:55:58.963645 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:58Z","lastTransitionTime":"2025-12-03T18:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.067419 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.067477 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.067496 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.067521 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.067541 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.170285 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.170339 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.170355 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.170380 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.170395 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.272779 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.272822 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.272830 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.272845 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.272857 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.375537 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.375592 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.375603 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.375627 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.375641 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.477655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.477696 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.477710 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.477726 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.477736 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.579801 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.579834 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.579842 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.579857 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.579867 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.681402 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.681436 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.681445 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.681458 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.681467 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.783041 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.783075 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.783087 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.783101 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.783112 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.874960 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9aa3f55-0f87-4674-baa9-6a8352acc19b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01e90d484f95985ec1cc9173880ee61ea08c57d7ec88ad06830ac1d05c0d89c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f2beac3dfd32d1c40bab439726f0beca4b861d2c2b79610397580a547d1806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://648c8c130807ec2a4b674893a9926daacaaa5e9d6b10b417e625214bbd7bbe05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88ae9c21ec68db4cf0ca247b2fa4c9f640f794f4b19cfd38a9fc5807dfa54e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a63e58f563cbbc866edc5b0cd96bb2f979454a2a287ee764ae38d4d8cae722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda2dfe939acee2bdb1e0c885d8e76cdfb9c84f7ceb484be31e9a2d03d42e1fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf07c93ea87b5820cd2bf65c02cfbf7ee7f0d19d9128f6e0f43c255ac119eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8baf200ee42ecc8e8f71b9f499f67f9428d08355f74714e5fb51c8d3eb5c6b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T18:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.886939 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.886965 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.886974 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.886987 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.887021 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.888844 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f7990727a4abc772fb4f260a6e3feb0e26cc29476a1b1b73470813166027d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.899758 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6591373190bbcf4d0025f8b04208a3735eb9cf0abf5438e393c58baca767fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.910441 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.920392 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzldf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b514e547-82a6-4f87-8879-aea6f4cd653d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9df75223b60f35dcadd2f3c6d9dc811b6f03623227b14813ecfde35ead976dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhdtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzldf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.938157 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2676769f-27dd-4ac2-9398-7322817ce55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:37Z\\\",\\\"message\\\":\\\"nsuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1203 18:55:36.764912 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-x7zbk after 0 failed attempt(s)\\\\nI1203 18:55:36.764946 6441 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-x7zbk\\\\nI1203 18:55:36.764952 6441 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1203 18:55:36.764965 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764977 6441 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd\\\\nI1203 18:55:36.764985 6441 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-mmjcd in node crc\\\\nI1203 18:55:36.764992 6441 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-mmjcd after 0 failed attempt(s)\\\\nI1203 18:55:36.764995 6441 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 18:55:36.764998 6441\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xcsvg_openshift-ovn-kubernetes(2676769f-27dd-4ac2-9398-7322817ce55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9870edf6e8a90fac36
73aa46385c5947ec489afc4db0f6041add423b678833d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsjcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xcsvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.948758 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p6zls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83957f97-f30b-4ea7-8849-c7264d61fd52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p6zls\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.960149 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6066ce05-b7e9-410d-ab21-7e8d01e8a271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68af97d7573fa316f046515262408bbd8e0c5931c2d343ad0bf1ecb2ffa1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df09abac4728553a28cbb7f13d5e371c03e5a3ac731fe1a65dc05f4d9614f448\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97086497b1800f711f50889e0a0b1a560485dc2af5ca2699c7ea5bfdab10a806\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.975541 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.990513 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.990586 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.990602 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.990624 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.990661 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:55:59Z","lastTransitionTime":"2025-12-03T18:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:55:59 crc kubenswrapper[4731]: I1203 18:55:59.994323 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b805df0495eff39d1468e8391df5729f0e95abc8bcbe33dea783e5f550b4f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a831d5bdec1a68f3f1d4769f13f95e503f56c15a92a1d802c82828449f285a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:55:59Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.004355 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gkw94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c0c2a2a-2759-494b-b107-06d0367eb3ed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7a791486b130b30546c24a150fedf4a66cf5da5fdba8cea07e1d61c7a28048\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvskp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gkw94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.016069 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0056941-56ab-4b8a-a25b-5fb8a83c9fb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\" (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 18:54:52.506060 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 18:54:52.508349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-5891195/tls.crt::/tmp/serving-cert-5891195/tls.key\\\\\\\"\\\\nI1203 18:54:57.937371 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 18:54:57.941640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 18:54:57.941706 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 18:54:57.941750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 18:54:57.941765 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 18:54:57.952491 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 18:54:57.952549 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952559 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 18:54:57.952570 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 18:54:57.952577 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 18:54:57.952584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 18:54:57.952591 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 18:54:57.952623 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 18:54:57.956312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
8:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.034226 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.047240 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95dced4d-3fd5-43d3-b87d-21ec9c80de8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50bddddf2090a0d2a22c3e2e745a53c2771a740c850be4dbced0c18b87255b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9
acb50c45c802074e052c21ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvzsp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mmjcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.061960 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7zbk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee4f887-8ce3-42c9-9886-06bdf109800c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99de44708f66eff904f8b59f32bc817fdce391655d10466f27829884f1d19bf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T18:55:52Z\\\",\\\"message\\\":\\\"2025-12-03T18:55:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bddabd57-4410-44e2-a2c5-083171082726\\\\n2025-12-03T18:55:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bddabd57-4410-44e2-a2c5-083171082726 to /host/opt/cni/bin/\\\\n2025-12-03T18:55:07Z [verbose] multus-daemon started\\\\n2025-12-03T18:55:07Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T18:55:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kbmfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7zbk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.078623 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-clrw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67102b0b-f85c-470b-8ba1-14eedddebae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://888
7150fbd7b781d6d419cbe1abcdb8c54b4f8774b96f90232490addb914b1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://867fe213ab8f8973a37dea1b35f9a8d47179dfe7bfe5ca250f5d8856856ddbfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e9e0138f299015b0f819c5fd0d675e9fbd77400e0cf1cddc2f21d499c990be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a222d4dc6d554e149bfabb86f57fe39aff7ceea469abe598e41b6b5c1571ec2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4782634286ff6aa84e21d29ec714bc2fc794d03a7e8e1b35f1e8aee49bd8cb70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dd8300e97c52cba9a18db77fac04cc1a0b194fa62d495e2c7f5a2bd5193655\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://200e5cd1be9712fa61940b0bfa25452c01dc5c454605a516e0be7e174f2d30c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5v6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-clrw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.092725 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df224548-ac65-4d57-973b-dd8c18d5992d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8fd05775110bd2999b96671dd8ebe171a4dac00d3860cb5557e5cd9b93e36a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d03e61f7db09f22355f9a919f97fd769ab5b04d868b2bf93a41672b433320ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c41df47675afe5057b8bfac3153b7d70529f2e073c942b167bce92a7ff96a17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b863d13bf1403f13a2c2b059c0aa0365109b5c376ddb4e3ce9a0816a096e0605\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T18:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T18:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:54:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.093297 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.093328 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.093339 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.093357 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.093369 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.105927 4731 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33e88da1-3764-46ce-a91f-fe154f3a9dfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T18:55:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b1952b30fcc64a9537b67c61ac656a1c43d97c1e75fcb2c7601ec033377e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f7ea0010a2f64c9d1edb3fc7f80d59f266b21428e1d53d747aa747ca5875ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T18:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2cxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T18:55:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-frxlb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:00Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.196540 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc 
kubenswrapper[4731]: I1203 18:56:00.196832 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.196902 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.196988 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.197077 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.300325 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.300407 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.300420 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.300442 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.300454 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.402538 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.402866 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.403070 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.403231 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.403437 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.506450 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.506504 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.506521 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.506545 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.506563 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.609378 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.609418 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.609432 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.609448 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.609460 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.711946 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.712029 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.712051 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.712075 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.712092 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.814655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.814705 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.814724 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.814745 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.814758 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.856043 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.856096 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.856068 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.856109 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:56:00 crc kubenswrapper[4731]: E1203 18:56:00.856445 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:56:00 crc kubenswrapper[4731]: E1203 18:56:00.856589 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:56:00 crc kubenswrapper[4731]: E1203 18:56:00.856632 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:56:00 crc kubenswrapper[4731]: E1203 18:56:00.856719 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.869864 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.917178 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.917212 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.917225 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.917242 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:00 crc kubenswrapper[4731]: I1203 18:56:00.917279 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:00Z","lastTransitionTime":"2025-12-03T18:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.019166 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.019202 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.019210 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.019225 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.019236 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.121933 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.121965 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.121974 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.121986 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.121994 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.223931 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.223981 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.223991 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.224005 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.224015 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.326547 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.326593 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.326608 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.326627 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.326639 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.429227 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.429287 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.429299 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.429315 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.429326 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.531741 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.531798 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.531812 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.531833 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.531845 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.634545 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.634610 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.634627 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.634651 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.634669 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.737375 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.737414 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.737427 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.737444 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.737457 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.840193 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.840306 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.840331 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.840511 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.840536 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.946868 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.946950 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.946968 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.946994 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:01 crc kubenswrapper[4731]: I1203 18:56:01.947012 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:01Z","lastTransitionTime":"2025-12-03T18:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.049687 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.049758 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.049770 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.049786 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.049797 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.151948 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.152011 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.152024 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.152039 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.152050 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.255017 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.255076 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.255086 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.255101 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.255111 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.357768 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.357813 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.357824 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.357841 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.357857 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.460585 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.460629 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.460655 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.460668 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.460678 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.562824 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.562858 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.562865 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.562878 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.562927 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.665441 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.665479 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.665490 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.665507 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.665518 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.761927 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762030 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 18:57:06.762009792 +0000 UTC m=+147.360604256 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.762077 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.762104 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.762127 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.762145 4731 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762203 4731 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762239 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:57:06.76223237 +0000 UTC m=+147.360826834 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762244 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762274 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762286 4731 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762311 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 18:57:06.762303393 +0000 UTC m=+147.360897857 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762433 4731 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762464 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762547 4731 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762580 4731 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762556 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 18:57:06.762534811 +0000 UTC m=+147.361129295 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.762731 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 18:57:06.762673916 +0000 UTC m=+147.361268400 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.768170 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.768220 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.768235 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.768278 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.768295 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.818642 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.818704 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.818715 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.818738 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.818751 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.831704 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:02Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.836019 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.836064 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.836079 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.836099 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.836111 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.848855 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:02Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.852028 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.852069 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.852080 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.852097 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.852108 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.855810 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.855809 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.855841 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.855933 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.856082 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.856180 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.856365 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.856497 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.865998 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:02Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.870112 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.870160 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.870174 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.870196 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.870212 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.885444 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:02Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.889228 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.889274 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.889301 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.889317 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.889327 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.901414 4731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T18:56:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0391d0b7-6bd6-4f67-80fe-4f045ea8f8ff\\\",\\\"systemUUID\\\":\\\"f0d786ec-d814-4b2d-8cec-fe62d92000dd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T18:56:02Z is after 2025-08-24T17:21:41Z" Dec 03 18:56:02 crc kubenswrapper[4731]: E1203 18:56:02.901564 4731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.903227 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.903278 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.903287 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.903302 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:02 crc kubenswrapper[4731]: I1203 18:56:02.903313 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:02Z","lastTransitionTime":"2025-12-03T18:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.005687 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.005728 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.005745 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.005789 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.005823 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.107928 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.107978 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.107989 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.108009 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.108024 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.210882 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.210925 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.210940 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.210956 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.210968 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.313868 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.313913 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.313926 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.313942 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.313954 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.416928 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.416968 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.416978 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.416992 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.417001 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.519932 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.519975 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.519986 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.520003 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.520016 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.622973 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.623032 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.623046 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.623064 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.623076 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.726358 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.726402 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.726411 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.726426 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.726436 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.828900 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.828952 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.828964 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.828983 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.828995 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.932129 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.932184 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.932195 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.932209 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:03 crc kubenswrapper[4731]: I1203 18:56:03.932219 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:03Z","lastTransitionTime":"2025-12-03T18:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.035584 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.035661 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.035683 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.035712 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.035734 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.139142 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.139209 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.139227 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.139285 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.139305 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.242124 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.242201 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.242225 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.242294 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.242324 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.345402 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.345462 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.345479 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.345504 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.345521 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.447924 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.447996 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.448015 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.448039 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.448056 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.550656 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.550699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.550711 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.550728 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.550739 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.653449 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.653513 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.653535 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.653560 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.653579 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.756692 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.756746 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.756767 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.756793 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.756813 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.855515 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.855578 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.855578 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.855640 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:56:04 crc kubenswrapper[4731]: E1203 18:56:04.855724 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:56:04 crc kubenswrapper[4731]: E1203 18:56:04.855865 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:56:04 crc kubenswrapper[4731]: E1203 18:56:04.855918 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:56:04 crc kubenswrapper[4731]: E1203 18:56:04.856008 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.859819 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.859869 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.859888 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.859914 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.859933 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.962494 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.963068 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.963082 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.963109 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:04 crc kubenswrapper[4731]: I1203 18:56:04.963120 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:04Z","lastTransitionTime":"2025-12-03T18:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.066384 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.066443 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.066459 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.066484 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.066503 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.169838 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.169904 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.169920 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.169943 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.169959 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.273027 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.273090 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.273112 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.273140 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.273164 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.375864 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.375916 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.376114 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.376133 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.376145 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.479346 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.479396 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.479413 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.479435 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.479450 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.581724 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.581796 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.581807 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.581824 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.581838 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.684514 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.684549 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.684557 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.684570 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.684581 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.786445 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.786500 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.786518 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.786538 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.786552 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.856349 4731 scope.go:117] "RemoveContainer" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.889092 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.889147 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.889164 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.889185 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.889200 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.991943 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.992017 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.992028 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.992050 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:05 crc kubenswrapper[4731]: I1203 18:56:05.992062 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:05Z","lastTransitionTime":"2025-12-03T18:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.094733 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.094781 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.094793 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.094810 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.094839 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.197377 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.197444 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.197533 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.197581 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.197603 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.301212 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.301251 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.301279 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.301293 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.301302 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.403457 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.403491 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.403499 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.403511 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.403519 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.506434 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.506481 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.506492 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.506509 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.506520 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.608744 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.608774 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.608782 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.608795 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.608803 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.711721 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.712043 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.712052 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.712067 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.712075 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.814594 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.814639 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.814649 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.814668 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.814681 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.856023 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:56:06 crc kubenswrapper[4731]: E1203 18:56:06.856140 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.856327 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:56:06 crc kubenswrapper[4731]: E1203 18:56:06.856375 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.856472 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:56:06 crc kubenswrapper[4731]: E1203 18:56:06.856515 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.856611 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:06 crc kubenswrapper[4731]: E1203 18:56:06.856662 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.916739 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.916786 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.916799 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.916814 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:06 crc kubenswrapper[4731]: I1203 18:56:06.916825 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:06Z","lastTransitionTime":"2025-12-03T18:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.019169 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.019212 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.019224 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.019243 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.019274 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.121587 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.121652 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.121660 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.121674 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.121684 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.223805 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.223843 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.223852 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.223866 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.223875 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.231709 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p6zls"] Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.326003 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.326035 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.326046 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.326063 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.326075 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.425943 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/2.log" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.427593 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.427818 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.427960 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.428100 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.428218 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.428725 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerStarted","Data":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.428745 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:07 crc kubenswrapper[4731]: E1203 18:56:07.428992 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.429551 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.453991 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=69.453970045 podStartE2EDuration="1m9.453970045s" podCreationTimestamp="2025-12-03 18:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.453649794 +0000 UTC m=+88.052244278" watchObservedRunningTime="2025-12-03 18:56:07.453970045 +0000 UTC m=+88.052564509" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.522882 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podStartSLOduration=63.522864262 podStartE2EDuration="1m3.522864262s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.522396066 +0000 UTC m=+88.120990540" watchObservedRunningTime="2025-12-03 18:56:07.522864262 +0000 UTC m=+88.121458736" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.523146 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-hzldf" podStartSLOduration=63.523138412 podStartE2EDuration="1m3.523138412s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.502104796 +0000 UTC m=+88.100699280" watchObservedRunningTime="2025-12-03 18:56:07.523138412 +0000 UTC m=+88.121732886" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.530947 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.530981 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.530990 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.531005 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.531016 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.545995 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.545977433 podStartE2EDuration="7.545977433s" podCreationTimestamp="2025-12-03 18:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.545714164 +0000 UTC m=+88.144308628" watchObservedRunningTime="2025-12-03 18:56:07.545977433 +0000 UTC m=+88.144571887" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.560793 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.560771455 podStartE2EDuration="1m6.560771455s" podCreationTimestamp="2025-12-03 18:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.560688102 +0000 UTC m=+88.159282566" watchObservedRunningTime="2025-12-03 18:56:07.560771455 +0000 UTC m=+88.159365919" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.601390 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gkw94" podStartSLOduration=63.601372725 podStartE2EDuration="1m3.601372725s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.601297992 +0000 UTC m=+88.199892456" watchObservedRunningTime="2025-12-03 18:56:07.601372725 +0000 UTC m=+88.199967189" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.615604 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podStartSLOduration=69.615588036 podStartE2EDuration="1m9.615588036s" podCreationTimestamp="2025-12-03 18:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.615513973 +0000 UTC m=+88.214108447" watchObservedRunningTime="2025-12-03 18:56:07.615588036 +0000 UTC m=+88.214182500" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.633192 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.633244 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.633288 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.633310 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.633322 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.643571 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podStartSLOduration=63.643551011 podStartE2EDuration="1m3.643551011s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.643293312 +0000 UTC m=+88.241887786" watchObservedRunningTime="2025-12-03 18:56:07.643551011 +0000 UTC m=+88.242145475" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.664213 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x7zbk" podStartSLOduration=63.664192374 podStartE2EDuration="1m3.664192374s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.66409271 +0000 UTC m=+88.262687184" watchObservedRunningTime="2025-12-03 18:56:07.664192374 +0000 UTC m=+88.262786838" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.682707 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-clrw9" podStartSLOduration=63.682690418 podStartE2EDuration="1m3.682690418s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.681998944 +0000 UTC m=+88.280593428" watchObservedRunningTime="2025-12-03 18:56:07.682690418 +0000 UTC m=+88.281284872" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.701030 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.701011177 
podStartE2EDuration="39.701011177s" podCreationTimestamp="2025-12-03 18:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.69998062 +0000 UTC m=+88.298575094" watchObservedRunningTime="2025-12-03 18:56:07.701011177 +0000 UTC m=+88.299605661" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.736193 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.736231 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.736244 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.736276 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.736289 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.838679 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.838722 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.838732 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.838750 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.838763 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.941366 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.941408 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.941416 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.941434 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:07 crc kubenswrapper[4731]: I1203 18:56:07.941443 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:07Z","lastTransitionTime":"2025-12-03T18:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.043716 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.044574 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.044671 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.044759 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.044836 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.147694 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.147740 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.147750 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.147765 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.147773 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.250506 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.250559 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.250573 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.250595 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.250622 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.353608 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.353646 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.353660 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.353678 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.353690 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.455226 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.455518 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.455527 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.455541 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.455551 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.557599 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.557627 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.557634 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.557647 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.557655 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.660527 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.660564 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.660574 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.660589 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.660599 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.762886 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.762923 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.762931 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.762945 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.762956 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.855171 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.855302 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.855314 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.855328 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:56:08 crc kubenswrapper[4731]: E1203 18:56:08.855419 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 18:56:08 crc kubenswrapper[4731]: E1203 18:56:08.855567 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 18:56:08 crc kubenswrapper[4731]: E1203 18:56:08.855627 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p6zls" podUID="83957f97-f30b-4ea7-8849-c7264d61fd52" Dec 03 18:56:08 crc kubenswrapper[4731]: E1203 18:56:08.855684 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.864670 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.864699 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.864708 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.864721 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.864730 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.967470 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.967524 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.967538 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.967558 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:08 crc kubenswrapper[4731]: I1203 18:56:08.967575 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:08Z","lastTransitionTime":"2025-12-03T18:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.070543 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.070593 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.070605 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.070622 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.070634 4731 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T18:56:09Z","lastTransitionTime":"2025-12-03T18:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.172979 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.173014 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.173029 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.173043 4731 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.173159 4731 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.204287 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-frxlb" podStartSLOduration=64.20423089 podStartE2EDuration="1m4.20423089s" podCreationTimestamp="2025-12-03 18:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:07.711027297 +0000 UTC m=+88.309621761" watchObservedRunningTime="2025-12-03 18:56:09.20423089 +0000 UTC m=+89.802825364" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.206181 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vvkjw"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.208877 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.212835 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q7prx"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.214300 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.214572 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.214817 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.219879 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.219877 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.220745 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nrz57"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.220913 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.221342 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.221470 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.222101 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.222595 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.222818 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-457cr"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.223200 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.224921 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2mkdc"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.226229 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlqm9"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.226315 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2mkdc" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.227310 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.227523 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.228551 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.229241 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kk955"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.233012 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.236954 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.237292 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.237719 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75kd2"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238004 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hl427"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238546 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238672 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238698 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238898 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238940 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.237996 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.239012 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.239116 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238127 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238177 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238462 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.239350 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238575 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238641 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238650 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238715 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238753 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238790 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238905 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238947 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238963 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.238998 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 18:56:09 
crc kubenswrapper[4731]: I1203 18:56:09.240613 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.240766 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.240852 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.240944 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241023 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241097 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241168 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241298 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241380 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241485 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241617 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 18:56:09 crc 
kubenswrapper[4731]: I1203 18:56:09.241731 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241869 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-drbj7"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.241978 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242057 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242125 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242508 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242549 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242670 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242720 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242781 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242877 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 
18:56:09.242927 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.242984 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.243076 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.243502 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.244729 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.244828 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.245054 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.245143 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.245233 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.245285 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.245339 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 18:56:09 
crc kubenswrapper[4731]: I1203 18:56:09.245450 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.248155 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.250350 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.250798 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mbm7g"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.251241 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.251431 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.265866 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.266073 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.266547 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.266661 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.266744 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.266682 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.267070 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.269037 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.270413 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.271630 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.273026 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.274993 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.300086 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.300442 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.300690 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302441 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302538 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302563 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302584 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302644 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302713 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302720 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302983 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.303083 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.303125 4731 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.302992 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.303185 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.303275 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.303348 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.303501 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.308702 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.309195 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-q4mkw"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.309820 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kh5lp"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.310444 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.310724 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.311737 4731 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.312317 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.312774 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.313525 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.314127 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.313853 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.314009 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.314097 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.314441 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.314612 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.315502 4731 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.315621 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.315788 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.315800 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.316061 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.317245 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.318964 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.319146 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.319886 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.321114 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.322737 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 
18:56:09.323297 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.323637 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srnnr"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.323897 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.323956 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.323993 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.333534 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.333741 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-65n9c"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.334615 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.335202 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.335695 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g"] Dec 03 18:56:09 crc kubenswrapper[4731]: 
I1203 18:56:09.336702 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.336843 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.336805 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.336799 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.338019 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341022 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341068 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-serving-cert\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341089 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4fj\" (UniqueName: \"kubernetes.io/projected/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-kube-api-access-mx4fj\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341112 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w6dj\" (UniqueName: \"kubernetes.io/projected/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-kube-api-access-6w6dj\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341134 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvbxz\" (UniqueName: \"kubernetes.io/projected/0b338d3c-19f3-42eb-bc21-89d971d0d38e-kube-api-access-vvbxz\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341166 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j7kv\" (UniqueName: \"kubernetes.io/projected/ae92a1b7-7488-465a-bf52-0cdc4de799f3-kube-api-access-9j7kv\") pod \"downloads-7954f5f757-2mkdc\" (UID: \"ae92a1b7-7488-465a-bf52-0cdc4de799f3\") " pod="openshift-console/downloads-7954f5f757-2mkdc" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341194 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-policies\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: 
\"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341221 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/11140349-9793-456a-8ac2-5bcd0e917ea1-images\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341247 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11140349-9793-456a-8ac2-5bcd0e917ea1-proxy-tls\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341285 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3a43226-6c7b-43bf-a154-093348017ac8-images\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341304 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a43226-6c7b-43bf-a154-093348017ac8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341329 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccad752d-e471-46b3-a898-aa85884563a7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341351 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-audit-policies\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341375 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-serving-cert\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341455 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-client-ca\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341474 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/11140349-9793-456a-8ac2-5bcd0e917ea1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: 
\"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.341496 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53338979-623e-4f88-8f10-41d65be09af5-etcd-client\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.348611 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.348741 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.348803 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.348847 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccad752d-e471-46b3-a898-aa85884563a7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.348923 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvgm\" (UniqueName: \"kubernetes.io/projected/7c1d214a-8055-4d3b-9131-c2c2510b2939-kube-api-access-ltvgm\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349040 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-node-pullsecrets\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349095 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-config\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349116 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-dir\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349137 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349188 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2528f79c-1409-4a38-9fef-a5b56cec0d3c-serving-cert\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349219 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b958e8b3-a0cb-4bb0-b051-befcfc7fad43-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9cslg\" (UID: \"b958e8b3-a0cb-4bb0-b051-befcfc7fad43\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349244 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n56v8\" (UniqueName: \"kubernetes.io/projected/53338979-623e-4f88-8f10-41d65be09af5-kube-api-access-n56v8\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349303 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-config\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349407 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53338979-623e-4f88-8f10-41d65be09af5-serving-cert\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349469 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrlc\" (UniqueName: \"kubernetes.io/projected/2528f79c-1409-4a38-9fef-a5b56cec0d3c-kube-api-access-lbrlc\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349518 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349573 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349669 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5nv\" (UniqueName: \"kubernetes.io/projected/00ce529b-1427-4c22-9c32-4f4a81aee646-kube-api-access-6r5nv\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349726 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrcpd\" (UniqueName: \"kubernetes.io/projected/9ddaec36-815c-4929-92dc-85e40f218be1-kube-api-access-zrcpd\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349792 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/120c759b-ba87-4faf-ac20-e8d340e845ac-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349910 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfrqm\" (UniqueName: \"kubernetes.io/projected/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-kube-api-access-wfrqm\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.349978 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c1d214a-8055-4d3b-9131-c2c2510b2939-auth-proxy-config\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350022 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-serving-cert\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350078 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-audit\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350123 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-etcd-client\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350199 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350241 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/120c759b-ba87-4faf-ac20-e8d340e845ac-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350290 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-etcd-service-ca\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350344 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fblt\" (UniqueName: \"kubernetes.io/projected/a3a43226-6c7b-43bf-a154-093348017ac8-kube-api-access-9fblt\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350390 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-service-ca-bundle\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350433 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-image-import-ca\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350460 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-etcd-ca\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350498 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nxjp\" (UniqueName: \"kubernetes.io/projected/4893d018-3556-4292-9c3f-b7741732e8eb-kube-api-access-5nxjp\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350546 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkh27\" (UniqueName: \"kubernetes.io/projected/11140349-9793-456a-8ac2-5bcd0e917ea1-kube-api-access-zkh27\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350583 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ccad752d-e471-46b3-a898-aa85884563a7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: 
\"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350644 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350681 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b338d3c-19f3-42eb-bc21-89d971d0d38e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350742 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-etcd-client\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350778 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ae83066-0d44-4bc0-9052-c255c65d821c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: \"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350804 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ccad752d-e471-46b3-a898-aa85884563a7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350857 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-config\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350884 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350905 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c4b117-c28a-484b-854a-0c2145d7d881-serving-cert\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350924 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b338d3c-19f3-42eb-bc21-89d971d0d38e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350943 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ce529b-1427-4c22-9c32-4f4a81aee646-config\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350961 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae83066-0d44-4bc0-9052-c255c65d821c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: \"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.350981 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.351014 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: 
\"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.351034 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.351036 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352037 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352481 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.351052 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-config\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352636 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ddaec36-815c-4929-92dc-85e40f218be1-serving-cert\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352658 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mmcg\" (UniqueName: \"kubernetes.io/projected/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-kube-api-access-8mmcg\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352695 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccad752d-e471-46b3-a898-aa85884563a7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352714 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-etcd-serving-ca\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352729 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-encryption-config\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352745 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f4c4b117-c28a-484b-854a-0c2145d7d881-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352769 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csmlb\" (UniqueName: \"kubernetes.io/projected/f4c4b117-c28a-484b-854a-0c2145d7d881-kube-api-access-csmlb\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352784 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ce529b-1427-4c22-9c32-4f4a81aee646-serving-cert\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352813 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-audit-dir\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352834 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c1d214a-8055-4d3b-9131-c2c2510b2939-machine-approver-tls\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352849 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-encryption-config\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352874 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sv8d\" (UniqueName: \"kubernetes.io/projected/b958e8b3-a0cb-4bb0-b051-befcfc7fad43-kube-api-access-4sv8d\") pod \"cluster-samples-operator-665b6dd947-9cslg\" (UID: \"b958e8b3-a0cb-4bb0-b051-befcfc7fad43\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352890 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skfwr\" (UniqueName: \"kubernetes.io/projected/ca25a279-e208-48df-b048-c4c7fa91c9a2-kube-api-access-skfwr\") pod \"dns-operator-744455d44c-mbm7g\" (UID: \"ca25a279-e208-48df-b048-c4c7fa91c9a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352907 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-config\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352925 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352943 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca25a279-e208-48df-b048-c4c7fa91c9a2-metrics-tls\") pod \"dns-operator-744455d44c-mbm7g\" (UID: \"ca25a279-e208-48df-b048-c4c7fa91c9a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352960 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352977 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kqsr\" (UniqueName: \"kubernetes.io/projected/9ae83066-0d44-4bc0-9052-c255c65d821c-kube-api-access-8kqsr\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: \"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352994 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353008 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a43226-6c7b-43bf-a154-093348017ac8-config\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353022 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120c759b-ba87-4faf-ac20-e8d340e845ac-config\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353044 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353064 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-client-ca\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353081 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4893d018-3556-4292-9c3f-b7741732e8eb-audit-dir\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353093 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ce529b-1427-4c22-9c32-4f4a81aee646-trusted-ca\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353126 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1d214a-8055-4d3b-9131-c2c2510b2939-config\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353143 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.353211 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.352924 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.354031 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.354372 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.354527 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.354872 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.355381 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.355406 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.358354 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.358739 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.359592 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.361085 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlh5v"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.362125 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.363782 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bbd9p"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.365144 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.366006 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b654x"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.366462 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.367144 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q7prx"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.368084 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2mkdc"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.369722 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vvkjw"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.371053 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.371688 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.372328 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.372807 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nrz57"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.373935 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.374381 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.375317 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.377482 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.378375 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.379399 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75kd2"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.380676 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.381698 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlqm9"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.382715 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mbm7g"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.383823 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.385108 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.387138 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.388065 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kh5lp"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.389350 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.390601 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-457cr"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.390723 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.392030 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.393182 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.394381 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.395367 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xwlrx"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.396015 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xwlrx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.396364 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hl427"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.397520 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.398669 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.399862 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-rnmc9"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.400501 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.401756 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.402966 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.403785 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srnnr"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.404813 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xwlrx"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.411019 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" 
Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.413171 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.414707 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.416104 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-65n9c"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.418016 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bbd9p"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.419393 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-drbj7"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.420733 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlh5v"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.422279 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b654x"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.423708 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.425063 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zf9fq"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.425877 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.426377 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jqx27"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.427486 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.427865 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.429223 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.430685 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zf9fq"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.430761 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.432028 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jqx27"] Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.451645 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454430 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/120c759b-ba87-4faf-ac20-e8d340e845ac-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454458 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r5nv\" (UniqueName: \"kubernetes.io/projected/00ce529b-1427-4c22-9c32-4f4a81aee646-kube-api-access-6r5nv\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454480 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrcpd\" (UniqueName: \"kubernetes.io/projected/9ddaec36-815c-4929-92dc-85e40f218be1-kube-api-access-zrcpd\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454504 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jnk4\" (UniqueName: \"kubernetes.io/projected/3033d290-d147-4727-8d61-0dabed08e76d-kube-api-access-9jnk4\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454524 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfrqm\" (UniqueName: \"kubernetes.io/projected/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-kube-api-access-wfrqm\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454544 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454575 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88affbdd-aaa2-449a-a1f7-f3bb203f176f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454600 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/febe3747-e9a5-4cde-84ee-2b7708794897-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m6cmg\" (UID: \"febe3747-e9a5-4cde-84ee-2b7708794897\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454625 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454645 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-audit\") pod 
\"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454670 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/120c759b-ba87-4faf-ac20-e8d340e845ac-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454690 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-etcd-ca\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454711 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fblt\" (UniqueName: \"kubernetes.io/projected/a3a43226-6c7b-43bf-a154-093348017ac8-kube-api-access-9fblt\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454733 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-service-ca-bundle\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454753 4731 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-5nxjp\" (UniqueName: \"kubernetes.io/projected/4893d018-3556-4292-9c3f-b7741732e8eb-kube-api-access-5nxjp\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454779 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454804 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ccad752d-e471-46b3-a898-aa85884563a7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454825 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/47f3df42-1762-4b23-a3b2-af671d9126df-tmpfs\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454848 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b338d3c-19f3-42eb-bc21-89d971d0d38e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454869 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-etcd-client\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454891 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ae83066-0d44-4bc0-9052-c255c65d821c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: \"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454913 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454931 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-config\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454953 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/00ce529b-1427-4c22-9c32-4f4a81aee646-config\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454975 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae83066-0d44-4bc0-9052-c255c65d821c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: \"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.454995 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455016 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455041 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ddaec36-815c-4929-92dc-85e40f218be1-serving-cert\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455062 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mmcg\" (UniqueName: \"kubernetes.io/projected/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-kube-api-access-8mmcg\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455083 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ce529b-1427-4c22-9c32-4f4a81aee646-serving-cert\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455108 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f4c4b117-c28a-484b-854a-0c2145d7d881-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455128 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csmlb\" (UniqueName: \"kubernetes.io/projected/f4c4b117-c28a-484b-854a-0c2145d7d881-kube-api-access-csmlb\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455147 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-encryption-config\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455166 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-service-ca\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455190 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0b43134-d5ba-4f4b-bd4b-e5a838d23b18-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k7gjd\" (UID: \"f0b43134-d5ba-4f4b-bd4b-e5a838d23b18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455213 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skfwr\" (UniqueName: \"kubernetes.io/projected/ca25a279-e208-48df-b048-c4c7fa91c9a2-kube-api-access-skfwr\") pod \"dns-operator-744455d44c-mbm7g\" (UID: \"ca25a279-e208-48df-b048-c4c7fa91c9a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455233 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-config\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455275 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scw6f\" (UniqueName: \"kubernetes.io/projected/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-kube-api-access-scw6f\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455298 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455319 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kqsr\" (UniqueName: \"kubernetes.io/projected/9ae83066-0d44-4bc0-9052-c255c65d821c-kube-api-access-8kqsr\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: \"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455339 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455358 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-client-ca\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455379 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ce529b-1427-4c22-9c32-4f4a81aee646-trusted-ca\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455406 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4fj\" (UniqueName: \"kubernetes.io/projected/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-kube-api-access-mx4fj\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455426 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455448 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2qj\" (UniqueName: \"kubernetes.io/projected/febe3747-e9a5-4cde-84ee-2b7708794897-kube-api-access-nw2qj\") pod \"package-server-manager-789f6589d5-m6cmg\" (UID: \"febe3747-e9a5-4cde-84ee-2b7708794897\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455470 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11140349-9793-456a-8ac2-5bcd0e917ea1-proxy-tls\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455489 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3a43226-6c7b-43bf-a154-093348017ac8-images\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455508 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a43226-6c7b-43bf-a154-093348017ac8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455527 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-oauth-config\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455546 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-console-config\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455567 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-audit-policies\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455588 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/11140349-9793-456a-8ac2-5bcd0e917ea1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455609 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455664 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47f3df42-1762-4b23-a3b2-af671d9126df-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455690 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88affbdd-aaa2-449a-a1f7-f3bb203f176f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455714 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455743 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvgm\" (UniqueName: \"kubernetes.io/projected/7c1d214a-8055-4d3b-9131-c2c2510b2939-kube-api-access-ltvgm\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455763 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4cpg\" (UniqueName: \"kubernetes.io/projected/47f3df42-1762-4b23-a3b2-af671d9126df-kube-api-access-w4cpg\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455785 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455804 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pfc\" (UniqueName: \"kubernetes.io/projected/f0b43134-d5ba-4f4b-bd4b-e5a838d23b18-kube-api-access-v7pfc\") pod \"control-plane-machine-set-operator-78cbb6b69f-k7gjd\" (UID: \"f0b43134-d5ba-4f4b-bd4b-e5a838d23b18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455826 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-dir\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455844 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2528f79c-1409-4a38-9fef-a5b56cec0d3c-serving-cert\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455865 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-cabundle\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455897 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53338979-623e-4f88-8f10-41d65be09af5-serving-cert\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455916 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88affbdd-aaa2-449a-a1f7-f3bb203f176f-config\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455937 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c1d214a-8055-4d3b-9131-c2c2510b2939-auth-proxy-config\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455977 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-serving-cert\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.455997 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-etcd-client\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456018 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-etcd-service-ca\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456050 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-image-import-ca\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456075 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkh27\" (UniqueName: \"kubernetes.io/projected/11140349-9793-456a-8ac2-5bcd0e917ea1-kube-api-access-zkh27\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456094 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456127 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456148 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ccad752d-e471-46b3-a898-aa85884563a7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456171 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-config\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456193 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c4b117-c28a-484b-854a-0c2145d7d881-serving-cert\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456213 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b338d3c-19f3-42eb-bc21-89d971d0d38e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456234 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456270 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccad752d-e471-46b3-a898-aa85884563a7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456293 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-etcd-serving-ca\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456314 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-encryption-config\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456334 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-audit-dir\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456355 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c1d214a-8055-4d3b-9131-c2c2510b2939-machine-approver-tls\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456378 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sv8d\" (UniqueName: \"kubernetes.io/projected/b958e8b3-a0cb-4bb0-b051-befcfc7fad43-kube-api-access-4sv8d\") pod \"cluster-samples-operator-665b6dd947-9cslg\" (UID: \"b958e8b3-a0cb-4bb0-b051-befcfc7fad43\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456399 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca25a279-e208-48df-b048-c4c7fa91c9a2-metrics-tls\") pod \"dns-operator-744455d44c-mbm7g\" (UID: \"ca25a279-e208-48df-b048-c4c7fa91c9a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456423 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-trusted-ca-bundle\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456447 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6vgg\" (UniqueName: \"kubernetes.io/projected/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-kube-api-access-f6vgg\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456469 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456490 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120c759b-ba87-4faf-ac20-e8d340e845ac-config\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456512 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456533 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a43226-6c7b-43bf-a154-093348017ac8-config\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456556 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4893d018-3556-4292-9c3f-b7741732e8eb-audit-dir\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456575 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456604 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1d214a-8055-4d3b-9131-c2c2510b2939-config\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456623 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456647 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 
18:56:09.456667 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-serving-cert\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456688 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w6dj\" (UniqueName: \"kubernetes.io/projected/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-kube-api-access-6w6dj\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456710 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvbxz\" (UniqueName: \"kubernetes.io/projected/0b338d3c-19f3-42eb-bc21-89d971d0d38e-kube-api-access-vvbxz\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456733 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j7kv\" (UniqueName: \"kubernetes.io/projected/ae92a1b7-7488-465a-bf52-0cdc4de799f3-kube-api-access-9j7kv\") pod \"downloads-7954f5f757-2mkdc\" (UID: \"ae92a1b7-7488-465a-bf52-0cdc4de799f3\") " pod="openshift-console/downloads-7954f5f757-2mkdc" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456754 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-policies\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: 
\"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456775 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/11140349-9793-456a-8ac2-5bcd0e917ea1-images\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456797 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccad752d-e471-46b3-a898-aa85884563a7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456817 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53338979-623e-4f88-8f10-41d65be09af5-etcd-client\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456836 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-serving-cert\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456857 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-client-ca\") 
pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456877 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-key\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456898 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456920 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccad752d-e471-46b3-a898-aa85884563a7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456942 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47f3df42-1762-4b23-a3b2-af671d9126df-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456964 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-node-pullsecrets\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.456984 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-config\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457008 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n56v8\" (UniqueName: \"kubernetes.io/projected/53338979-623e-4f88-8f10-41d65be09af5-kube-api-access-n56v8\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457030 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b958e8b3-a0cb-4bb0-b051-befcfc7fad43-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9cslg\" (UID: \"b958e8b3-a0cb-4bb0-b051-befcfc7fad43\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457053 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-config\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: 
I1203 18:56:09.457075 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457098 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrlc\" (UniqueName: \"kubernetes.io/projected/2528f79c-1409-4a38-9fef-a5b56cec0d3c-kube-api-access-lbrlc\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457120 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-serving-cert\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457143 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-oauth-serving-cert\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457165 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nrz57\" (UID: 
\"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457188 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.457927 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.460702 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.460723 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-config\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.461165 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-audit\") pod 
\"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.461760 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-etcd-ca\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.462019 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c1d214a-8055-4d3b-9131-c2c2510b2939-auth-proxy-config\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.462209 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-service-ca-bundle\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.462689 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ccad752d-e471-46b3-a898-aa85884563a7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.463142 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0b338d3c-19f3-42eb-bc21-89d971d0d38e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.463904 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-audit-policies\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.464901 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/11140349-9793-456a-8ac2-5bcd0e917ea1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.464906 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1d214a-8055-4d3b-9131-c2c2510b2939-config\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.464909 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3a43226-6c7b-43bf-a154-093348017ac8-images\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.465234 4731 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-client-ca\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.466270 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ce529b-1427-4c22-9c32-4f4a81aee646-trusted-ca\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.466778 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4893d018-3556-4292-9c3f-b7741732e8eb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.466804 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-dir\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.467099 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53338979-623e-4f88-8f10-41d65be09af5-serving-cert\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.467297 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.467535 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.467694 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-policies\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.468442 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-config\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.468488 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-node-pullsecrets\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.473896 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.474132 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b338d3c-19f3-42eb-bc21-89d971d0d38e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.474978 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-config\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.475406 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-config\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.475610 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ce529b-1427-4c22-9c32-4f4a81aee646-config\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc 
kubenswrapper[4731]: I1203 18:56:09.475701 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccad752d-e471-46b3-a898-aa85884563a7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.475792 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.475826 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-etcd-serving-ca\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.476195 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f4c4b117-c28a-484b-854a-0c2145d7d881-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.479372 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4893d018-3556-4292-9c3f-b7741732e8eb-audit-dir\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.478085 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-config\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.478155 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-image-import-ca\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.478594 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ddaec36-815c-4929-92dc-85e40f218be1-serving-cert\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.478714 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c4b117-c28a-484b-854a-0c2145d7d881-serving-cert\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.478845 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.478891 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-audit-dir\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.479207 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.479292 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.479317 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.476467 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-client-ca\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.480006 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.480124 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ccad752d-e471-46b3-a898-aa85884563a7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.480384 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/53338979-623e-4f88-8f10-41d65be09af5-etcd-service-ca\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.480617 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a43226-6c7b-43bf-a154-093348017ac8-config\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.480910 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.481415 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-serving-cert\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.482007 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ce529b-1427-4c22-9c32-4f4a81aee646-serving-cert\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.482390 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-etcd-client\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.482408 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-etcd-client\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.482536 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/4893d018-3556-4292-9c3f-b7741732e8eb-encryption-config\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.482598 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3a43226-6c7b-43bf-a154-093348017ac8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.482742 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.482904 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b958e8b3-a0cb-4bb0-b051-befcfc7fad43-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9cslg\" (UID: \"b958e8b3-a0cb-4bb0-b051-befcfc7fad43\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.482964 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-serving-cert\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.483023 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.483062 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.483306 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca25a279-e208-48df-b048-c4c7fa91c9a2-metrics-tls\") pod \"dns-operator-744455d44c-mbm7g\" (UID: \"ca25a279-e208-48df-b048-c4c7fa91c9a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.483804 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-encryption-config\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.483885 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.484182 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ae83066-0d44-4bc0-9052-c255c65d821c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: \"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.484441 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.484615 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.484619 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccad752d-e471-46b3-a898-aa85884563a7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.484937 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-serving-cert\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.485235 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2528f79c-1409-4a38-9fef-a5b56cec0d3c-serving-cert\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.485779 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53338979-623e-4f88-8f10-41d65be09af5-etcd-client\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.486068 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c1d214a-8055-4d3b-9131-c2c2510b2939-machine-approver-tls\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.491818 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.495869 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae83066-0d44-4bc0-9052-c255c65d821c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: 
\"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.513014 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.531148 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.552484 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.557631 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/11140349-9793-456a-8ac2-5bcd0e917ea1-images\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558034 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2qj\" (UniqueName: \"kubernetes.io/projected/febe3747-e9a5-4cde-84ee-2b7708794897-kube-api-access-nw2qj\") pod \"package-server-manager-789f6589d5-m6cmg\" (UID: \"febe3747-e9a5-4cde-84ee-2b7708794897\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558065 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558090 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-oauth-config\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558105 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-console-config\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558277 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47f3df42-1762-4b23-a3b2-af671d9126df-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558296 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88affbdd-aaa2-449a-a1f7-f3bb203f176f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558324 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4cpg\" (UniqueName: \"kubernetes.io/projected/47f3df42-1762-4b23-a3b2-af671d9126df-kube-api-access-w4cpg\") pod 
\"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558421 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7pfc\" (UniqueName: \"kubernetes.io/projected/f0b43134-d5ba-4f4b-bd4b-e5a838d23b18-kube-api-access-v7pfc\") pod \"control-plane-machine-set-operator-78cbb6b69f-k7gjd\" (UID: \"f0b43134-d5ba-4f4b-bd4b-e5a838d23b18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558447 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-cabundle\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558537 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88affbdd-aaa2-449a-a1f7-f3bb203f176f-config\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558592 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558627 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-trusted-ca-bundle\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558724 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6vgg\" (UniqueName: \"kubernetes.io/projected/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-kube-api-access-f6vgg\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558780 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47f3df42-1762-4b23-a3b2-af671d9126df-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558821 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-key\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558855 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:09 crc 
kubenswrapper[4731]: I1203 18:56:09.558873 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-serving-cert\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558887 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-oauth-serving-cert\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558916 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jnk4\" (UniqueName: \"kubernetes.io/projected/3033d290-d147-4727-8d61-0dabed08e76d-kube-api-access-9jnk4\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558935 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558958 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/febe3747-e9a5-4cde-84ee-2b7708794897-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m6cmg\" (UID: 
\"febe3747-e9a5-4cde-84ee-2b7708794897\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.558974 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88affbdd-aaa2-449a-a1f7-f3bb203f176f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.559403 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.559428 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/47f3df42-1762-4b23-a3b2-af671d9126df-tmpfs\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.559533 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-service-ca\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.559557 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f0b43134-d5ba-4f4b-bd4b-e5a838d23b18-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k7gjd\" (UID: \"f0b43134-d5ba-4f4b-bd4b-e5a838d23b18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.559643 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scw6f\" (UniqueName: \"kubernetes.io/projected/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-kube-api-access-scw6f\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.560037 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/47f3df42-1762-4b23-a3b2-af671d9126df-tmpfs\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.571447 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.590827 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.602827 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/120c759b-ba87-4faf-ac20-e8d340e845ac-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.611586 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.620421 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120c759b-ba87-4faf-ac20-e8d340e845ac-config\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.630823 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.650965 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.657631 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11140349-9793-456a-8ac2-5bcd0e917ea1-proxy-tls\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.671238 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.691354 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 
18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.711548 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.731535 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.751170 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.771158 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.791531 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.811485 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.830907 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.850997 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.891502 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.911014 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.931644 4731 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.950562 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.972749 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 18:56:09 crc kubenswrapper[4731]: I1203 18:56:09.991447 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.011168 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.031217 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.042915 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/febe3747-e9a5-4cde-84ee-2b7708794897-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m6cmg\" (UID: \"febe3747-e9a5-4cde-84ee-2b7708794897\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.051149 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.072099 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 18:56:10 crc 
kubenswrapper[4731]: I1203 18:56:10.091267 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.110956 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.132081 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.151621 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.163163 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.171837 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.196245 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.199502 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.210690 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.222118 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47f3df42-1762-4b23-a3b2-af671d9126df-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.222213 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47f3df42-1762-4b23-a3b2-af671d9126df-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.231962 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.239737 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-console-config\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.251449 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.262276 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/88affbdd-aaa2-449a-a1f7-f3bb203f176f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.271500 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.281077 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-service-ca\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.291639 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.302098 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-serving-cert\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.311480 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.325493 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-oauth-config\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.331989 4731 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.349517 4731 request.go:700] Waited for 1.010384105s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/secrets?fieldSelector=metadata.name%3Dconsole-dockercfg-f62pw&limit=500&resourceVersion=0 Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.351410 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.375906 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.380376 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-trusted-ca-bundle\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.391885 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.411726 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.436184 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.439898 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-oauth-serving-cert\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.451002 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.459822 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88affbdd-aaa2-449a-a1f7-f3bb203f176f-config\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.471608 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.491448 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.503330 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0b43134-d5ba-4f4b-bd4b-e5a838d23b18-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k7gjd\" (UID: \"f0b43134-d5ba-4f4b-bd4b-e5a838d23b18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.512192 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.531795 4731 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.550935 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 18:56:10 crc kubenswrapper[4731]: E1203 18:56:10.559058 4731 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Dec 03 18:56:10 crc kubenswrapper[4731]: E1203 18:56:10.559160 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-cabundle podName:7460fcca-b017-4d16-9d8a-8d3c4cc910e9 nodeName:}" failed. No retries permitted until 2025-12-03 18:56:11.059136172 +0000 UTC m=+91.657730636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-cabundle") pod "service-ca-9c57cc56f-bbd9p" (UID: "7460fcca-b017-4d16-9d8a-8d3c4cc910e9") : failed to sync configmap cache: timed out waiting for the condition Dec 03 18:56:10 crc kubenswrapper[4731]: E1203 18:56:10.559410 4731 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 18:56:10 crc kubenswrapper[4731]: E1203 18:56:10.559449 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-serving-cert podName:3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad nodeName:}" failed. No retries permitted until 2025-12-03 18:56:11.059438123 +0000 UTC m=+91.658032577 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" (UID: "3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad") : failed to sync secret cache: timed out waiting for the condition Dec 03 18:56:10 crc kubenswrapper[4731]: E1203 18:56:10.559465 4731 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 03 18:56:10 crc kubenswrapper[4731]: E1203 18:56:10.559475 4731 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Dec 03 18:56:10 crc kubenswrapper[4731]: E1203 18:56:10.559541 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-config podName:3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad nodeName:}" failed. No retries permitted until 2025-12-03 18:56:11.059518726 +0000 UTC m=+91.658113190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" (UID: "3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad") : failed to sync configmap cache: timed out waiting for the condition Dec 03 18:56:10 crc kubenswrapper[4731]: E1203 18:56:10.559639 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-key podName:7460fcca-b017-4d16-9d8a-8d3c4cc910e9 nodeName:}" failed. No retries permitted until 2025-12-03 18:56:11.059624879 +0000 UTC m=+91.658219433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-key") pod "service-ca-9c57cc56f-bbd9p" (UID: "7460fcca-b017-4d16-9d8a-8d3c4cc910e9") : failed to sync secret cache: timed out waiting for the condition Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.571244 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.598226 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.611926 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.631925 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.651736 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.671516 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.691724 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.712428 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.731585 4731 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.751545 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.771100 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.791274 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.811150 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.831595 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.851589 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.856046 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.856107 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.856195 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.856296 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.872225 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.891745 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.911925 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.931319 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.951989 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.972327 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 18:56:10 crc kubenswrapper[4731]: I1203 18:56:10.992164 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.011533 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.031931 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.051472 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 18:56:11 crc 
kubenswrapper[4731]: I1203 18:56:11.070876 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.079570 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-key\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.079659 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.079714 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.079891 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-cabundle\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.080888 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-cabundle\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.080925 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.083172 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-signing-key\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.084900 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.092181 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.111452 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.131489 4731 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.150852 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.171538 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.191741 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.231945 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.252297 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.271764 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.292961 4731 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.312069 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.331537 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.349551 4731 request.go:700] Waited for 1.89085365s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.366156 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5nv\" (UniqueName: \"kubernetes.io/projected/00ce529b-1427-4c22-9c32-4f4a81aee646-kube-api-access-6r5nv\") pod \"console-operator-58897d9998-457cr\" (UID: \"00ce529b-1427-4c22-9c32-4f4a81aee646\") " pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.387063 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrcpd\" (UniqueName: \"kubernetes.io/projected/9ddaec36-815c-4929-92dc-85e40f218be1-kube-api-access-zrcpd\") pod \"route-controller-manager-6576b87f9c-2z65s\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.410461 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfrqm\" (UniqueName: \"kubernetes.io/projected/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-kube-api-access-wfrqm\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.426584 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/120c759b-ba87-4faf-ac20-e8d340e845ac-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tj2gg\" (UID: \"120c759b-ba87-4faf-ac20-e8d340e845ac\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.438025 4731 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-457cr" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.445944 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fblt\" (UniqueName: \"kubernetes.io/projected/a3a43226-6c7b-43bf-a154-093348017ac8-kube-api-access-9fblt\") pod \"machine-api-operator-5694c8668f-vvkjw\" (UID: \"a3a43226-6c7b-43bf-a154-093348017ac8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.466544 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kqsr\" (UniqueName: \"kubernetes.io/projected/9ae83066-0d44-4bc0-9052-c255c65d821c-kube-api-access-8kqsr\") pod \"kube-storage-version-migrator-operator-b67b599dd-7spwt\" (UID: \"9ae83066-0d44-4bc0-9052-c255c65d821c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.498134 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxjp\" (UniqueName: \"kubernetes.io/projected/4893d018-3556-4292-9c3f-b7741732e8eb-kube-api-access-5nxjp\") pod \"apiserver-7bbb656c7d-xrbtb\" (UID: \"4893d018-3556-4292-9c3f-b7741732e8eb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.505466 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4fj\" (UniqueName: \"kubernetes.io/projected/8c6e4ce0-c1e7-4aa9-8b20-3949db22818e-kube-api-access-mx4fj\") pod \"apiserver-76f77b778f-nrz57\" (UID: \"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e\") " pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.509766 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.527004 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe9f6fa9-efd7-4569-9c8b-c6932458eec7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xplb6\" (UID: \"fe9f6fa9-efd7-4569-9c8b-c6932458eec7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.541897 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.551597 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvgm\" (UniqueName: \"kubernetes.io/projected/7c1d214a-8055-4d3b-9131-c2c2510b2939-kube-api-access-ltvgm\") pod \"machine-approver-56656f9798-kk955\" (UID: \"7c1d214a-8055-4d3b-9131-c2c2510b2939\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.568662 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skfwr\" (UniqueName: \"kubernetes.io/projected/ca25a279-e208-48df-b048-c4c7fa91c9a2-kube-api-access-skfwr\") pod \"dns-operator-744455d44c-mbm7g\" (UID: \"ca25a279-e208-48df-b048-c4c7fa91c9a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.591206 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j7kv\" (UniqueName: \"kubernetes.io/projected/ae92a1b7-7488-465a-bf52-0cdc4de799f3-kube-api-access-9j7kv\") pod \"downloads-7954f5f757-2mkdc\" (UID: \"ae92a1b7-7488-465a-bf52-0cdc4de799f3\") " 
pod="openshift-console/downloads-7954f5f757-2mkdc" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.607939 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvbxz\" (UniqueName: \"kubernetes.io/projected/0b338d3c-19f3-42eb-bc21-89d971d0d38e-kube-api-access-vvbxz\") pod \"openshift-apiserver-operator-796bbdcf4f-h5w5k\" (UID: \"0b338d3c-19f3-42eb-bc21-89d971d0d38e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.617748 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.626948 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w6dj\" (UniqueName: \"kubernetes.io/projected/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-kube-api-access-6w6dj\") pod \"oauth-openshift-558db77b4-wlqm9\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.628776 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-457cr"] Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.637858 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.647620 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrlc\" (UniqueName: \"kubernetes.io/projected/2528f79c-1409-4a38-9fef-a5b56cec0d3c-kube-api-access-lbrlc\") pod \"controller-manager-879f6c89f-q7prx\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.665564 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccad752d-e471-46b3-a898-aa85884563a7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2hrqb\" (UID: \"ccad752d-e471-46b3-a898-aa85884563a7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.668022 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.688567 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csmlb\" (UniqueName: \"kubernetes.io/projected/f4c4b117-c28a-484b-854a-0c2145d7d881-kube-api-access-csmlb\") pod \"openshift-config-operator-7777fb866f-hl427\" (UID: \"f4c4b117-c28a-484b-854a-0c2145d7d881\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.697727 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"] Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.708929 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sv8d\" (UniqueName: \"kubernetes.io/projected/b958e8b3-a0cb-4bb0-b051-befcfc7fad43-kube-api-access-4sv8d\") pod \"cluster-samples-operator-665b6dd947-9cslg\" (UID: \"b958e8b3-a0cb-4bb0-b051-befcfc7fad43\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.710574 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.713476 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.731897 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mmcg\" (UniqueName: \"kubernetes.io/projected/f6eaf9e7-49e9-4438-bd42-659e9f4df03f-kube-api-access-8mmcg\") pod \"authentication-operator-69f744f599-75kd2\" (UID: \"f6eaf9e7-49e9-4438-bd42-659e9f4df03f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.746140 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkh27\" (UniqueName: \"kubernetes.io/projected/11140349-9793-456a-8ac2-5bcd0e917ea1-kube-api-access-zkh27\") pod \"machine-config-operator-74547568cd-q2pc2\" (UID: \"11140349-9793-456a-8ac2-5bcd0e917ea1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.757874 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2mkdc" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.758864 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt"] Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.765682 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.773333 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.777410 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n56v8\" (UniqueName: \"kubernetes.io/projected/53338979-623e-4f88-8f10-41d65be09af5-kube-api-access-n56v8\") pod \"etcd-operator-b45778765-drbj7\" (UID: \"53338979-623e-4f88-8f10-41d65be09af5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.780206 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.786538 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2qj\" (UniqueName: \"kubernetes.io/projected/febe3747-e9a5-4cde-84ee-2b7708794897-kube-api-access-nw2qj\") pod \"package-server-manager-789f6589d5-m6cmg\" (UID: \"febe3747-e9a5-4cde-84ee-2b7708794897\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.794359 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.803577 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.809583 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88affbdd-aaa2-449a-a1f7-f3bb203f176f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fx564\" (UID: \"88affbdd-aaa2-449a-a1f7-f3bb203f176f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.822401 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.824954 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.829286 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pfc\" (UniqueName: \"kubernetes.io/projected/f0b43134-d5ba-4f4b-bd4b-e5a838d23b18-kube-api-access-v7pfc\") pod \"control-plane-machine-set-operator-78cbb6b69f-k7gjd\" (UID: \"f0b43134-d5ba-4f4b-bd4b-e5a838d23b18\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.830585 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vvkjw"] Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.836402 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.847488 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.849640 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4cpg\" (UniqueName: \"kubernetes.io/projected/47f3df42-1762-4b23-a3b2-af671d9126df-kube-api-access-w4cpg\") pod \"packageserver-d55dfcdfc-6kdxt\" (UID: \"47f3df42-1762-4b23-a3b2-af671d9126df\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.861989 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.873387 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d9ltn\" (UID: \"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.888396 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jnk4\" (UniqueName: \"kubernetes.io/projected/3033d290-d147-4727-8d61-0dabed08e76d-kube-api-access-9jnk4\") pod \"console-f9d7485db-65n9c\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.911315 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6vgg\" (UniqueName: \"kubernetes.io/projected/7460fcca-b017-4d16-9d8a-8d3c4cc910e9-kube-api-access-f6vgg\") pod \"service-ca-9c57cc56f-bbd9p\" (UID: \"7460fcca-b017-4d16-9d8a-8d3c4cc910e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:11 crc 
kubenswrapper[4731]: I1203 18:56:11.922619 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.929148 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scw6f\" (UniqueName: \"kubernetes.io/projected/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-kube-api-access-scw6f\") pod \"marketplace-operator-79b997595-srnnr\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.953927 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.955539 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q7prx"] Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.971701 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.977738 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:11 crc kubenswrapper[4731]: I1203 18:56:11.992419 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.005793 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.008414 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k"] Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.011618 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.016767 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nrz57"] Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.017954 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.026006 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.032294 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.032549 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.035755 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg"] Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.039699 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.053111 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.069944 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.074338 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2mkdc"] Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.085542 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.103841 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f78ccef1-3b10-4bee-b8b7-19beedacb04b-cert\") pod \"ingress-canary-xwlrx\" (UID: \"f78ccef1-3b10-4bee-b8b7-19beedacb04b\") " pod="openshift-ingress-canary/ingress-canary-xwlrx" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.103872 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj5r2\" (UniqueName: \"kubernetes.io/projected/858f1146-e313-4cc2-a195-01fa1a85e62e-kube-api-access-gj5r2\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.103890 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef7bee06-2bfb-4426-82a3-410ef3205bee-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.103922 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmqvp\" (UniqueName: \"kubernetes.io/projected/d856a441-2665-48cd-aed6-d48c9ff0f3c4-kube-api-access-fmqvp\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.103958 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/858f1146-e313-4cc2-a195-01fa1a85e62e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.103975 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b586f69-0fbe-4e39-b741-db5e4fd70d21-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: \"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104035 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63540dce-b2ef-48ab-9aad-dc0afcbec369-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc 
kubenswrapper[4731]: I1203 18:56:12.104053 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-bound-sa-token\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104069 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-metrics-certs\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104100 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfsj\" (UniqueName: \"kubernetes.io/projected/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-kube-api-access-7qfsj\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104120 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbp72\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-kube-api-access-pbp72\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104137 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz8fk\" (UniqueName: 
\"kubernetes.io/projected/f78ccef1-3b10-4bee-b8b7-19beedacb04b-kube-api-access-fz8fk\") pod \"ingress-canary-xwlrx\" (UID: \"f78ccef1-3b10-4bee-b8b7-19beedacb04b\") " pod="openshift-ingress-canary/ingress-canary-xwlrx" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104154 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz2w6\" (UniqueName: \"kubernetes.io/projected/294f0919-18b7-42f8-8528-c3ada8d12d53-kube-api-access-bz2w6\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104189 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp785\" (UniqueName: \"kubernetes.io/projected/3fbd46a2-b020-4bae-9c09-579b17677c8f-kube-api-access-kp785\") pod \"migrator-59844c95c7-hgf5k\" (UID: \"3fbd46a2-b020-4bae-9c09-579b17677c8f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104220 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d856a441-2665-48cd-aed6-d48c9ff0f3c4-srv-cert\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104239 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-config\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:12 crc 
kubenswrapper[4731]: I1203 18:56:12.104286 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g74dt\" (UniqueName: \"kubernetes.io/projected/ef7bee06-2bfb-4426-82a3-410ef3205bee-kube-api-access-g74dt\") pod \"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104308 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf996a8-0dfa-4d25-9112-5a3c4688cb77-service-ca-bundle\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104335 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e30694-94d5-4131-87ef-dfd2e1acec56-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104352 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnr2b\" (UniqueName: \"kubernetes.io/projected/65e30694-94d5-4131-87ef-dfd2e1acec56-kube-api-access-mnr2b\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104367 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-stats-auth\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104384 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hstcn\" (UniqueName: \"kubernetes.io/projected/7b586f69-0fbe-4e39-b741-db5e4fd70d21-kube-api-access-hstcn\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: \"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104415 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104433 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/858f1146-e313-4cc2-a195-01fa1a85e62e-metrics-tls\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104476 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-tls\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104502 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b586f69-0fbe-4e39-b741-db5e4fd70d21-proxy-tls\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: \"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104517 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d856a441-2665-48cd-aed6-d48c9ff0f3c4-profile-collector-cert\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104532 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294f0919-18b7-42f8-8528-c3ada8d12d53-config-volume\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104560 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-certificates\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104574 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-default-certificate\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104596 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wxg\" (UniqueName: \"kubernetes.io/projected/acf996a8-0dfa-4d25-9112-5a3c4688cb77-kube-api-access-v9wxg\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104630 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6gq4\" (UniqueName: \"kubernetes.io/projected/149fd6d7-3127-457a-8d8b-03de32877b50-kube-api-access-h6gq4\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104646 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vldqh\" (UniqueName: \"kubernetes.io/projected/d7b3325d-acb8-4fe7-a7cc-30df55f51b09-kube-api-access-vldqh\") pod \"multus-admission-controller-857f4d67dd-kh5lp\" (UID: \"d7b3325d-acb8-4fe7-a7cc-30df55f51b09\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104671 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294f0919-18b7-42f8-8528-c3ada8d12d53-secret-volume\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104686 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/858f1146-e313-4cc2-a195-01fa1a85e62e-trusted-ca\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104707 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-trusted-ca\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104740 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/149fd6d7-3127-457a-8d8b-03de32877b50-certs\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104765 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/149fd6d7-3127-457a-8d8b-03de32877b50-node-bootstrap-token\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104797 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/ef7bee06-2bfb-4426-82a3-410ef3205bee-srv-cert\") pod \"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104813 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65e30694-94d5-4131-87ef-dfd2e1acec56-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104846 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-serving-cert\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104861 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63540dce-b2ef-48ab-9aad-dc0afcbec369-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.104892 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7b3325d-acb8-4fe7-a7cc-30df55f51b09-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kh5lp\" (UID: \"d7b3325d-acb8-4fe7-a7cc-30df55f51b09\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.108408 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:12.60839278 +0000 UTC m=+93.206987234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.206238 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.207417 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/149fd6d7-3127-457a-8d8b-03de32877b50-certs\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.207464 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-metrics-tls\") pod \"dns-default-zf9fq\" (UID: 
\"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.207508 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/149fd6d7-3127-457a-8d8b-03de32877b50-node-bootstrap-token\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.207523 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-config-volume\") pod \"dns-default-zf9fq\" (UID: \"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.207978 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-registration-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208005 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef7bee06-2bfb-4426-82a3-410ef3205bee-srv-cert\") pod \"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208049 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65e30694-94d5-4131-87ef-dfd2e1acec56-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208065 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-mountpoint-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208082 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-serving-cert\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208104 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63540dce-b2ef-48ab-9aad-dc0afcbec369-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208121 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7b3325d-acb8-4fe7-a7cc-30df55f51b09-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kh5lp\" (UID: \"d7b3325d-acb8-4fe7-a7cc-30df55f51b09\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208178 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-plugins-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208203 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f78ccef1-3b10-4bee-b8b7-19beedacb04b-cert\") pod \"ingress-canary-xwlrx\" (UID: \"f78ccef1-3b10-4bee-b8b7-19beedacb04b\") " pod="openshift-ingress-canary/ingress-canary-xwlrx" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208220 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj5r2\" (UniqueName: \"kubernetes.io/projected/858f1146-e313-4cc2-a195-01fa1a85e62e-kube-api-access-gj5r2\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208235 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef7bee06-2bfb-4426-82a3-410ef3205bee-profile-collector-cert\") pod \"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208266 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqvp\" (UniqueName: \"kubernetes.io/projected/d856a441-2665-48cd-aed6-d48c9ff0f3c4-kube-api-access-fmqvp\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:12 crc 
kubenswrapper[4731]: I1203 18:56:12.208291 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/858f1146-e313-4cc2-a195-01fa1a85e62e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208317 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b586f69-0fbe-4e39-b741-db5e4fd70d21-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: \"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208351 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63540dce-b2ef-48ab-9aad-dc0afcbec369-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208366 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-bound-sa-token\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208382 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-metrics-certs\") pod \"router-default-5444994796-q4mkw\" (UID: 
\"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208402 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbnz9\" (UniqueName: \"kubernetes.io/projected/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-kube-api-access-fbnz9\") pod \"dns-default-zf9fq\" (UID: \"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208428 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbp72\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-kube-api-access-pbp72\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208446 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfsj\" (UniqueName: \"kubernetes.io/projected/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-kube-api-access-7qfsj\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208478 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz8fk\" (UniqueName: \"kubernetes.io/projected/f78ccef1-3b10-4bee-b8b7-19beedacb04b-kube-api-access-fz8fk\") pod \"ingress-canary-xwlrx\" (UID: \"f78ccef1-3b10-4bee-b8b7-19beedacb04b\") " pod="openshift-ingress-canary/ingress-canary-xwlrx" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208509 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz2w6\" (UniqueName: 
\"kubernetes.io/projected/294f0919-18b7-42f8-8528-c3ada8d12d53-kube-api-access-bz2w6\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208534 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp785\" (UniqueName: \"kubernetes.io/projected/3fbd46a2-b020-4bae-9c09-579b17677c8f-kube-api-access-kp785\") pod \"migrator-59844c95c7-hgf5k\" (UID: \"3fbd46a2-b020-4bae-9c09-579b17677c8f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208592 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d856a441-2665-48cd-aed6-d48c9ff0f3c4-srv-cert\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208617 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-config\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208633 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g74dt\" (UniqueName: \"kubernetes.io/projected/ef7bee06-2bfb-4426-82a3-410ef3205bee-kube-api-access-g74dt\") pod \"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208659 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf996a8-0dfa-4d25-9112-5a3c4688cb77-service-ca-bundle\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208829 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e30694-94d5-4131-87ef-dfd2e1acec56-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208847 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnr2b\" (UniqueName: \"kubernetes.io/projected/65e30694-94d5-4131-87ef-dfd2e1acec56-kube-api-access-mnr2b\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208867 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-stats-auth\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208883 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hstcn\" (UniqueName: \"kubernetes.io/projected/7b586f69-0fbe-4e39-b741-db5e4fd70d21-kube-api-access-hstcn\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: 
\"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208915 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/858f1146-e313-4cc2-a195-01fa1a85e62e-metrics-tls\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208940 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-tls\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.208983 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b586f69-0fbe-4e39-b741-db5e4fd70d21-proxy-tls\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: \"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.209008 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294f0919-18b7-42f8-8528-c3ada8d12d53-config-volume\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.209022 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/d856a441-2665-48cd-aed6-d48c9ff0f3c4-profile-collector-cert\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.209071 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-certificates\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.209088 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-default-certificate\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.209180 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wxg\" (UniqueName: \"kubernetes.io/projected/acf996a8-0dfa-4d25-9112-5a3c4688cb77-kube-api-access-v9wxg\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.209283 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:12.709203364 +0000 UTC m=+93.307797838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.209439 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6gq4\" (UniqueName: \"kubernetes.io/projected/149fd6d7-3127-457a-8d8b-03de32877b50-kube-api-access-h6gq4\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.210890 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vldqh\" (UniqueName: \"kubernetes.io/projected/d7b3325d-acb8-4fe7-a7cc-30df55f51b09-kube-api-access-vldqh\") pod \"multus-admission-controller-857f4d67dd-kh5lp\" (UID: \"d7b3325d-acb8-4fe7-a7cc-30df55f51b09\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.210936 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-socket-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.210962 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-csi-data-dir\") pod 
\"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.211003 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/858f1146-e313-4cc2-a195-01fa1a85e62e-trusted-ca\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.211026 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294f0919-18b7-42f8-8528-c3ada8d12d53-secret-volume\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.211065 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-trusted-ca\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.211087 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj47w\" (UniqueName: \"kubernetes.io/projected/0f05323b-fadc-4481-a4b9-112753505b1e-kube-api-access-zj47w\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.221817 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/65e30694-94d5-4131-87ef-dfd2e1acec56-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.238418 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/858f1146-e313-4cc2-a195-01fa1a85e62e-trusted-ca\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.239235 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/149fd6d7-3127-457a-8d8b-03de32877b50-certs\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.239583 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-metrics-certs\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.239868 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef7bee06-2bfb-4426-82a3-410ef3205bee-srv-cert\") pod \"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.239865 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-tls\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.240028 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63540dce-b2ef-48ab-9aad-dc0afcbec369-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.240645 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/149fd6d7-3127-457a-8d8b-03de32877b50-node-bootstrap-token\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.241243 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65e30694-94d5-4131-87ef-dfd2e1acec56-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.241287 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b586f69-0fbe-4e39-b741-db5e4fd70d21-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: \"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 
03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.242030 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-certificates\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.242378 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7b3325d-acb8-4fe7-a7cc-30df55f51b09-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kh5lp\" (UID: \"d7b3325d-acb8-4fe7-a7cc-30df55f51b09\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.242874 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf996a8-0dfa-4d25-9112-5a3c4688cb77-service-ca-bundle\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.243633 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63540dce-b2ef-48ab-9aad-dc0afcbec369-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.243899 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f78ccef1-3b10-4bee-b8b7-19beedacb04b-cert\") pod \"ingress-canary-xwlrx\" (UID: \"f78ccef1-3b10-4bee-b8b7-19beedacb04b\") " pod="openshift-ingress-canary/ingress-canary-xwlrx" Dec 
03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.244232 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-config\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.245907 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294f0919-18b7-42f8-8528-c3ada8d12d53-config-volume\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.246597 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-serving-cert\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.246773 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-stats-auth\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.247052 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-trusted-ca\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" 
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.248621 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b586f69-0fbe-4e39-b741-db5e4fd70d21-proxy-tls\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: \"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.252211 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d856a441-2665-48cd-aed6-d48c9ff0f3c4-profile-collector-cert\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.255365 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d856a441-2665-48cd-aed6-d48c9ff0f3c4-srv-cert\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.256475 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef7bee06-2bfb-4426-82a3-410ef3205bee-profile-collector-cert\") pod \"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.264230 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/858f1146-e313-4cc2-a195-01fa1a85e62e-metrics-tls\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: 
\"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.267781 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-bound-sa-token\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.272704 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/acf996a8-0dfa-4d25-9112-5a3c4688cb77-default-certificate\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.273063 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294f0919-18b7-42f8-8528-c3ada8d12d53-secret-volume\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.286872 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnr2b\" (UniqueName: \"kubernetes.io/projected/65e30694-94d5-4131-87ef-dfd2e1acec56-kube-api-access-mnr2b\") pod \"openshift-controller-manager-operator-756b6f6bc6-xrx95\" (UID: \"65e30694-94d5-4131-87ef-dfd2e1acec56\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.288169 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6gq4\" (UniqueName: 
\"kubernetes.io/projected/149fd6d7-3127-457a-8d8b-03de32877b50-kube-api-access-h6gq4\") pod \"machine-config-server-rnmc9\" (UID: \"149fd6d7-3127-457a-8d8b-03de32877b50\") " pod="openshift-machine-config-operator/machine-config-server-rnmc9" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.314600 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-plugins-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.314965 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbnz9\" (UniqueName: \"kubernetes.io/projected/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-kube-api-access-fbnz9\") pod \"dns-default-zf9fq\" (UID: \"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315069 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315151 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-socket-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315177 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-csi-data-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315177 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp785\" (UniqueName: \"kubernetes.io/projected/3fbd46a2-b020-4bae-9c09-579b17677c8f-kube-api-access-kp785\") pod \"migrator-59844c95c7-hgf5k\" (UID: \"3fbd46a2-b020-4bae-9c09-579b17677c8f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315204 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj47w\" (UniqueName: \"kubernetes.io/projected/0f05323b-fadc-4481-a4b9-112753505b1e-kube-api-access-zj47w\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315277 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-metrics-tls\") pod \"dns-default-zf9fq\" (UID: \"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315302 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-config-volume\") pod \"dns-default-zf9fq\" (UID: \"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315326 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-registration-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315359 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-mountpoint-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315446 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-mountpoint-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.315507 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:12.815492766 +0000 UTC m=+93.414087220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315632 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-socket-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315713 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-csi-data-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.315878 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-plugins-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.316420 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0f05323b-fadc-4481-a4b9-112753505b1e-registration-dir\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27" Dec 03 18:56:12 crc 
kubenswrapper[4731]: I1203 18:56:12.317373 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-config-volume\") pod \"dns-default-zf9fq\" (UID: \"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.323804 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-metrics-tls\") pod \"dns-default-zf9fq\" (UID: \"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.336825 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hstcn\" (UniqueName: \"kubernetes.io/projected/7b586f69-0fbe-4e39-b741-db5e4fd70d21-kube-api-access-hstcn\") pod \"machine-config-controller-84d6567774-lb29g\" (UID: \"7b586f69-0fbe-4e39-b741-db5e4fd70d21\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.365173 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.373183 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz8fk\" (UniqueName: \"kubernetes.io/projected/f78ccef1-3b10-4bee-b8b7-19beedacb04b-kube-api-access-fz8fk\") pod \"ingress-canary-xwlrx\" (UID: \"f78ccef1-3b10-4bee-b8b7-19beedacb04b\") " pod="openshift-ingress-canary/ingress-canary-xwlrx"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.377123 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.382346 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlqm9"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.397148 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz2w6\" (UniqueName: \"kubernetes.io/projected/294f0919-18b7-42f8-8528-c3ada8d12d53-kube-api-access-bz2w6\") pod \"collect-profiles-29413125-g6bsm\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.399874 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.408441 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xwlrx"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.413587 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wxg\" (UniqueName: \"kubernetes.io/projected/acf996a8-0dfa-4d25-9112-5a3c4688cb77-kube-api-access-v9wxg\") pod \"router-default-5444994796-q4mkw\" (UID: \"acf996a8-0dfa-4d25-9112-5a3c4688cb77\") " pod="openshift-ingress/router-default-5444994796-q4mkw"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.416178 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.416596 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:12.91658042 +0000 UTC m=+93.515174884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.416669 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rnmc9"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.427084 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfsj\" (UniqueName: \"kubernetes.io/projected/f2c36b28-04ab-4880-a79e-11d47a6fe1e6-kube-api-access-7qfsj\") pod \"service-ca-operator-777779d784-b654x\" (UID: \"f2c36b28-04ab-4880-a79e-11d47a6fe1e6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.454157 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj5r2\" (UniqueName: \"kubernetes.io/projected/858f1146-e313-4cc2-a195-01fa1a85e62e-kube-api-access-gj5r2\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.466960 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"
event={"ID":"9ddaec36-815c-4929-92dc-85e40f218be1","Type":"ContainerStarted","Data":"7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.467183 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" event={"ID":"9ddaec36-815c-4929-92dc-85e40f218be1","Type":"ContainerStarted","Data":"6eaf2e87be2b84ba8f06139c0930b9edb39f89e1a3e801f1cd34eb563d5a6198"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.468136 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.493406 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vldqh\" (UniqueName: \"kubernetes.io/projected/d7b3325d-acb8-4fe7-a7cc-30df55f51b09-kube-api-access-vldqh\") pod \"multus-admission-controller-857f4d67dd-kh5lp\" (UID: \"d7b3325d-acb8-4fe7-a7cc-30df55f51b09\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.499429 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.499479 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-75kd2"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.513429 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-drbj7"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.516493 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/858f1146-e313-4cc2-a195-01fa1a85e62e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v7ln2\" (UID: \"858f1146-e313-4cc2-a195-01fa1a85e62e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.517228 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v"
Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.520184 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.020170114 +0000 UTC m=+93.618764578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.529628 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.531287 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hl427"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.548661 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" event={"ID":"4893d018-3556-4292-9c3f-b7741732e8eb","Type":"ContainerStarted","Data":"0fa1e0df635c1015bb3f18bdee3de7a85aa3e50f7f87957aac88128622425e89"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.549459 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-q4mkw"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.550083 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74dt\" (UniqueName: \"kubernetes.io/projected/ef7bee06-2bfb-4426-82a3-410ef3205bee-kube-api-access-g74dt\") pod \"olm-operator-6b444d44fb-knsx5\" (UID: \"ef7bee06-2bfb-4426-82a3-410ef3205bee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.560516 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmqvp\" (UniqueName: \"kubernetes.io/projected/d856a441-2665-48cd-aed6-d48c9ff0f3c4-kube-api-access-fmqvp\") pod \"catalog-operator-68c6474976-z4xtb\" (UID: \"d856a441-2665-48cd-aed6-d48c9ff0f3c4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.560743 4731 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.567724 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbp72\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-kube-api-access-pbp72\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.568031 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.579282 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj47w\" (UniqueName: \"kubernetes.io/projected/0f05323b-fadc-4481-a4b9-112753505b1e-kube-api-access-zj47w\") pod \"csi-hostpathplugin-jqx27\" (UID: \"0f05323b-fadc-4481-a4b9-112753505b1e\") " pod="hostpath-provisioner/csi-hostpathplugin-jqx27"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.588342 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.615968 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbnz9\" (UniqueName: \"kubernetes.io/projected/d04233c6-8cf6-4450-aa74-cab1e1d12f4a-kube-api-access-fbnz9\") pod \"dns-default-zf9fq\" (UID: \"d04233c6-8cf6-4450-aa74-cab1e1d12f4a\") " pod="openshift-dns/dns-default-zf9fq"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.616281 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2mkdc" event={"ID":"ae92a1b7-7488-465a-bf52-0cdc4de799f3","Type":"ContainerStarted","Data":"7f2d6c139efea5c548081af3d4732f413835eec47e369c0836304fe9e1e10e6f"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.617739 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.618166 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.618502 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.118474209 +0000 UTC m=+93.717068673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.648157 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.651627 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.652514 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" event={"ID":"120c759b-ba87-4faf-ac20-e8d340e845ac","Type":"ContainerStarted","Data":"fcd729bf625065cae6128b29ac26f36712d73888d6c6d7c3348964457e3757ca"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.653211 4731 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.655567 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" event={"ID":"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e","Type":"ContainerStarted","Data":"5e14f720ac92ce2fa0d45dfa2fd5e1f4f2bca3fb867e733b6b4628453205efe3"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.658378 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mbm7g"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.680054 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" event={"ID":"2528f79c-1409-4a38-9fef-a5b56cec0d3c","Type":"ContainerStarted","Data":"3859de746066143e9cb9614044ad167d156c0137771a2d874c174769c29c05ba"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.681165 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.682488 4731 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-q7prx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.682554 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" podUID="2528f79c-1409-4a38-9fef-a5b56cec0d3c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.692222 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.693903 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.694369 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.696135 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" event={"ID":"ccad752d-e471-46b3-a898-aa85884563a7","Type":"ContainerStarted","Data":"86a699c6671be1605b8a827898f298ebf6ccbaa1791c986ffd5f6c601bf47236"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.718559 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" event={"ID":"9ae83066-0d44-4bc0-9052-c255c65d821c","Type":"ContainerStarted","Data":"c0d8e767d003676849b2a08323504e9cf6fe0803201297eaab7b170299174039"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.718734 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" event={"ID":"9ae83066-0d44-4bc0-9052-c255c65d821c","Type":"ContainerStarted","Data":"9ae3b32c8bfa6ae37fc733bc4ac04a281181bcb6147b3cc564df3af06836fd45"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.718954 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-65n9c"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.719353 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.721660 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zf9fq"
Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.722066 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.221372018 +0000 UTC m=+93.819966492 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.726304 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" event={"ID":"7c1d214a-8055-4d3b-9131-c2c2510b2939","Type":"ContainerStarted","Data":"3e98435cad77bc0f4f74e6a75e39d2959fbb3712ca339c021a9025f0c1f0eacb"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.729426 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" event={"ID":"0b338d3c-19f3-42eb-bc21-89d971d0d38e","Type":"ContainerStarted","Data":"7f402b0bc4adf4414593dd4498f4bea96a3c021e8ccdc1d98752154c15924388"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203
18:56:12.737133 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" event={"ID":"a3a43226-6c7b-43bf-a154-093348017ac8","Type":"ContainerStarted","Data":"ea66a54cbfef72a4d6137ce273b6ff08cb57be7d7750776b3c9f687533cce959"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.744769 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jqx27"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.759508 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-457cr" event={"ID":"00ce529b-1427-4c22-9c32-4f4a81aee646","Type":"ContainerStarted","Data":"ee67aab7b57edac534b73223a42cd8238163b298dd137c534f5c0776e500eb97"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.759568 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-457cr" event={"ID":"00ce529b-1427-4c22-9c32-4f4a81aee646","Type":"ContainerStarted","Data":"8488653fdc6cbd62dcac752437baf600b17cdc8ca1a6a59d0cffd66de7edc52a"}
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.760904 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-457cr"
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.787845 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg"]
Dec 03 18:56:12 crc kubenswrapper[4731]: W1203 18:56:12.793821 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11140349_9793_456a_8ac2_5bcd0e917ea1.slice/crio-f71cf364155a9073f63ef7b96a503cef1ed8778389c932c9238f40e9474cc996 WatchSource:0}: Error finding container f71cf364155a9073f63ef7b96a503cef1ed8778389c932c9238f40e9474cc996: Status 404 returned error can't find the container with id f71cf364155a9073f63ef7b96a503cef1ed8778389c932c9238f40e9474cc996
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.820007 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.828160 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.328139677 +0000 UTC m=+93.926734141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.828198 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.843761 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srnnr"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.900946 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.931066 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v"
Dec 03 18:56:12 crc kubenswrapper[4731]: E1203 18:56:12.932157 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.432144216 +0000 UTC m=+94.030738680 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.953648 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k"]
Dec 03 18:56:12 crc kubenswrapper[4731]: I1203 18:56:12.989084 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bbd9p"]
Dec 03 18:56:13 crc kubenswrapper[4731]: W1203 18:56:13.026302 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfebe3747_e9a5_4cde_84ee_2b7708794897.slice/crio-752ed909e14f95402aa3e46cade6fcf2b2bd093b8bf019ffd7a68a070e3505b7 WatchSource:0}: Error finding container 752ed909e14f95402aa3e46cade6fcf2b2bd093b8bf019ffd7a68a070e3505b7: Status 404 returned
error can't find the container with id 752ed909e14f95402aa3e46cade6fcf2b2bd093b8bf019ffd7a68a070e3505b7
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.032587 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.032831 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.532805485 +0000 UTC m=+94.131399949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.033063 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v"
Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.036404 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.536386474 +0000 UTC m=+94.134980938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.046855 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd"]
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.115752 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.134229 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.134546 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.634533132 +0000 UTC m=+94.233127596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.137584 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95"]
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.139134 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xwlrx"]
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.212421 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kh5lp"]
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.235382 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v"
Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.235723 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.735712 +0000 UTC m=+94.334306464 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.341627 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.342061 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.842047093 +0000 UTC m=+94.440641557 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.342545 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v"
Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.342824 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.84281663 +0000 UTC m=+94.441411094 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.433877 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-457cr"
Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.443471 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.443986 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.943936725 +0000 UTC m=+94.542531189 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.444221 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.444563 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:13.944549678 +0000 UTC m=+94.543144142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.498062 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-457cr" podStartSLOduration=69.498046991 podStartE2EDuration="1m9.498046991s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:13.496288608 +0000 UTC m=+94.094883082" watchObservedRunningTime="2025-12-03 18:56:13.498046991 +0000 UTC m=+94.096641455" Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.540447 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2"] Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.549002 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.549514 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 18:56:14.049492971 +0000 UTC m=+94.648087435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.560997 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5"] Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.652439 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.652881 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.152867647 +0000 UTC m=+94.751462121 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.753284 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.753699 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.253684742 +0000 UTC m=+94.852279206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.810093 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xwlrx" event={"ID":"f78ccef1-3b10-4bee-b8b7-19beedacb04b","Type":"ContainerStarted","Data":"22fc3ef69e3bf4010283413bb53395ea5d5bdba367f2bf059b4675921c9ddb0e"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.812274 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" event={"ID":"4893d018-3556-4292-9c3f-b7741732e8eb","Type":"ContainerStarted","Data":"93053afd1e972c7f1b000e68a07fce5ad65bd6fc296dd7fd69b7a088c06e62c6"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.826184 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" event={"ID":"d7b3325d-acb8-4fe7-a7cc-30df55f51b09","Type":"ContainerStarted","Data":"9d5bfed1d4459b3796bf754d8bed09223fe6e06289a9d762fcda3b4b86812af7"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.827217 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" event={"ID":"7c1d214a-8055-4d3b-9131-c2c2510b2939","Type":"ContainerStarted","Data":"f96aa8d45980a4ecae22df44f74ccfd6ab3fa223dcb04f114d9081054c1e1828"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.827897 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" 
event={"ID":"294f0919-18b7-42f8-8528-c3ada8d12d53","Type":"ContainerStarted","Data":"91413d33324cc60e52537cc3deff8650079e575522a378deafc24fa50ff05476"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.855959 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.856278 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.35626648 +0000 UTC m=+94.954860944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.899021 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" event={"ID":"a3a43226-6c7b-43bf-a154-093348017ac8","Type":"ContainerStarted","Data":"f7c5e5b973ec681919920a623d17a00a8246716e3fb457784a950a95cbc247df"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.899069 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" 
event={"ID":"a3a43226-6c7b-43bf-a154-093348017ac8","Type":"ContainerStarted","Data":"46cdabddd6dcc1fa0b51012573df7e2fd408ec4e9acec2d8c0f0a3d708bfdd8e"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.912707 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" event={"ID":"2528f79c-1409-4a38-9fef-a5b56cec0d3c","Type":"ContainerStarted","Data":"246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.920571 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" event={"ID":"3b79ffaf-63b0-4a26-bf1a-654f53537a2b","Type":"ContainerStarted","Data":"05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.920616 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" event={"ID":"3b79ffaf-63b0-4a26-bf1a-654f53537a2b","Type":"ContainerStarted","Data":"a4f6528dae6495c11bc0c1b9db2a22210b469b8b9a3c391fbe41a4f2e3e3756e"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.920964 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.930623 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.935202 4731 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wlqm9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.935268 4731 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" podUID="3b79ffaf-63b0-4a26-bf1a-654f53537a2b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.956442 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.956586 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.456563775 +0000 UTC m=+95.055158239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.956749 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:13 crc kubenswrapper[4731]: E1203 18:56:13.958808 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.458794785 +0000 UTC m=+95.057389249 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:13 crc kubenswrapper[4731]: W1203 18:56:13.959569 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef7bee06_2bfb_4426_82a3_410ef3205bee.slice/crio-175fc3491d6edfc6220384d94a2729f1a40026891001aa2d5527020e7679017a WatchSource:0}: Error finding container 175fc3491d6edfc6220384d94a2729f1a40026891001aa2d5527020e7679017a: Status 404 returned error can't find the container with id 175fc3491d6edfc6220384d94a2729f1a40026891001aa2d5527020e7679017a Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.960991 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" event={"ID":"ca25a279-e208-48df-b048-c4c7fa91c9a2","Type":"ContainerStarted","Data":"ca5f99225cfdfadd8c1336f04d01dc1861aea9e7880333effad69cd8580f2e0b"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.978551 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" event={"ID":"0b338d3c-19f3-42eb-bc21-89d971d0d38e","Type":"ContainerStarted","Data":"ded56a85bedc3cc8893dfe5292b65e0fa2e51489d5e8e0ed4feb8e702cfaa899"} Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.981427 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7spwt" podStartSLOduration=69.981415119 podStartE2EDuration="1m9.981415119s" 
podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:13.981293554 +0000 UTC m=+94.579888038" watchObservedRunningTime="2025-12-03 18:56:13.981415119 +0000 UTC m=+94.580009583" Dec 03 18:56:13 crc kubenswrapper[4731]: I1203 18:56:13.994496 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" event={"ID":"f0b43134-d5ba-4f4b-bd4b-e5a838d23b18","Type":"ContainerStarted","Data":"53da81aa8796734fa6e9a4711eeff4e8b6bcf71ab3868e2a3b4bf9e6d3909948"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.007275 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" podStartSLOduration=69.007242357 podStartE2EDuration="1m9.007242357s" podCreationTimestamp="2025-12-03 18:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:14.006278662 +0000 UTC m=+94.604873136" watchObservedRunningTime="2025-12-03 18:56:14.007242357 +0000 UTC m=+94.605836821" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.012643 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65n9c" event={"ID":"3033d290-d147-4727-8d61-0dabed08e76d","Type":"ContainerStarted","Data":"4231724b37da979aa08e9c07bf3a38a430e943c6d1a7f6ee2b3d8a8c4a159550"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.053722 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k" event={"ID":"3fbd46a2-b020-4bae-9c09-579b17677c8f","Type":"ContainerStarted","Data":"5ffa9ef64d425c9ca0cbf5640c73bb964a6b4e9356356f888a70addf60ea8ec3"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.057927 
4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.061517 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.561491348 +0000 UTC m=+95.160085812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.069480 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" event={"ID":"ccad752d-e471-46b3-a898-aa85884563a7","Type":"ContainerStarted","Data":"3882a06e45df11bc5a62c1697c1e1e026fbb165c25919286c8d056d9264aa576"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.072352 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:14 crc 
kubenswrapper[4731]: E1203 18:56:14.073817 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.57380373 +0000 UTC m=+95.172398194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.142665 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb"] Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.174520 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2hrqb" podStartSLOduration=70.17450039 podStartE2EDuration="1m10.17450039s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:14.123191226 +0000 UTC m=+94.721785690" watchObservedRunningTime="2025-12-03 18:56:14.17450039 +0000 UTC m=+94.773094854" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.178303 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" event={"ID":"8788b686-0b60-4ad3-9e34-16f6fb03c2d0","Type":"ContainerStarted","Data":"c5e913793c2ff945cdbf0a3bbdd8119ae72e6d652dd847439f06c98de604653a"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.183183 4731 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.186555 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.686536263 +0000 UTC m=+95.285130727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.226813 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" event={"ID":"7460fcca-b017-4d16-9d8a-8d3c4cc910e9","Type":"ContainerStarted","Data":"961f489212fb04d5813dc22800ffd1ce21cb915c5a0f4f3c0f1ea3ada8aea31b"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.238612 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b654x"] Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.246552 4731 generic.go:334] "Generic (PLEG): container finished" podID="8c6e4ce0-c1e7-4aa9-8b20-3949db22818e" containerID="43b4b67af7b4dde0a7ff5c404ce9ff45a9abeb219897afd083001d078e556b35" exitCode=0 Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 
18:56:14.246641 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" event={"ID":"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e","Type":"ContainerDied","Data":"43b4b67af7b4dde0a7ff5c404ce9ff45a9abeb219897afd083001d078e556b35"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.284333 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.287441 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g"] Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.289895 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.789881169 +0000 UTC m=+95.388475633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.306871 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" event={"ID":"febe3747-e9a5-4cde-84ee-2b7708794897","Type":"ContainerStarted","Data":"752ed909e14f95402aa3e46cade6fcf2b2bd093b8bf019ffd7a68a070e3505b7"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.308567 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" podStartSLOduration=70.308555691 podStartE2EDuration="1m10.308555691s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:14.307127399 +0000 UTC m=+94.905721863" watchObservedRunningTime="2025-12-03 18:56:14.308555691 +0000 UTC m=+94.907150155" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.337461 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" event={"ID":"65e30694-94d5-4131-87ef-dfd2e1acec56","Type":"ContainerStarted","Data":"216db651e958a0e9d02aee0f23b9209c37119edd652ee5c2c2050d64d137a814"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.373376 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" 
event={"ID":"88affbdd-aaa2-449a-a1f7-f3bb203f176f","Type":"ContainerStarted","Data":"bc62492586ef786fdc3fd7085d30ee81b0b693e6866570a15c0a7545b41a0b5d"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.394717 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.395274 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:14.895238907 +0000 UTC m=+95.493833371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.425748 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jqx27"] Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.426082 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" event={"ID":"f6eaf9e7-49e9-4438-bd42-659e9f4df03f","Type":"ContainerStarted","Data":"cc178cdbeb34a8665d267f80f2d30316252563c8ef3e12d3aabcee49c0072624"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.426117 4731 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" event={"ID":"f6eaf9e7-49e9-4438-bd42-659e9f4df03f","Type":"ContainerStarted","Data":"ead717897d03ffb6bd3afffddea4790457e08b7e4015cb92c2a9c64389b5ee26"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.449820 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" event={"ID":"b958e8b3-a0cb-4bb0-b051-befcfc7fad43","Type":"ContainerStarted","Data":"626688d1723baf9fa32da049e5ce56fac841416a622d0413f9612bad77084699"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.482273 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" event={"ID":"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad","Type":"ContainerStarted","Data":"85a4b057b09978eefeb89c08a0bae010f4d7e490e45438b5ce8a4d2e0111ec04"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.492238 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rnmc9" event={"ID":"149fd6d7-3127-457a-8d8b-03de32877b50","Type":"ContainerStarted","Data":"0ab9f8035a48d0cc43066069fba0508fc8ae25acc784b34967451a8d2d462e25"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.498097 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.504453 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.004422652 +0000 UTC m=+95.603017116 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.525912 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zf9fq"] Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.541562 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" event={"ID":"53338979-623e-4f88-8f10-41d65be09af5","Type":"ContainerStarted","Data":"11369d648cd7fa8d2bb1cb88c6a5fbe3f8ce6aea4f98fca179092938db586d8b"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.587050 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" event={"ID":"120c759b-ba87-4faf-ac20-e8d340e845ac","Type":"ContainerStarted","Data":"4a408e0a260b550c79aa36229415d8f34d4a332b8ca957ba3f4195731492d898"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.598763 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.599003 4731 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.098976592 +0000 UTC m=+95.697571056 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.610078 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-75kd2" podStartSLOduration=70.610045949 podStartE2EDuration="1m10.610045949s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:14.604739439 +0000 UTC m=+95.203333903" watchObservedRunningTime="2025-12-03 18:56:14.610045949 +0000 UTC m=+95.208640413" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.616116 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.616535 4731 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.116519542 +0000 UTC m=+95.715114006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.618143 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" event={"ID":"47f3df42-1762-4b23-a3b2-af671d9126df","Type":"ContainerStarted","Data":"c98445a51cb573628860f42cf100dada1767e3f8df581eb1fe5d3a1720dfbe81"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.619229 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.648697 4731 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdxt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:5443/healthz\": dial tcp 10.217.0.18:5443: connect: connection refused" start-of-body= Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.648770 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" podUID="47f3df42-1762-4b23-a3b2-af671d9126df" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.18:5443/healthz\": dial tcp 10.217.0.18:5443: connect: connection 
refused" Dec 03 18:56:14 crc kubenswrapper[4731]: W1203 18:56:14.662023 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd04233c6_8cf6_4450_aa74_cab1e1d12f4a.slice/crio-acf61a8fe004aa896b2622d2c3d3222934aa9392b23143b262a99232c9447e27 WatchSource:0}: Error finding container acf61a8fe004aa896b2622d2c3d3222934aa9392b23143b262a99232c9447e27: Status 404 returned error can't find the container with id acf61a8fe004aa896b2622d2c3d3222934aa9392b23143b262a99232c9447e27 Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.668409 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" event={"ID":"fe9f6fa9-efd7-4569-9c8b-c6932458eec7","Type":"ContainerStarted","Data":"7fb0ed03d60617cf59f7e47cc1fe4224411933a806950d9db4cd986c435dbc01"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.668562 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" event={"ID":"fe9f6fa9-efd7-4569-9c8b-c6932458eec7","Type":"ContainerStarted","Data":"5a114b875d4e21c9acd2d5a6f3a36e7bbbf2b5240ef8839d680c0355ee708706"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.700964 4731 generic.go:334] "Generic (PLEG): container finished" podID="f4c4b117-c28a-484b-854a-0c2145d7d881" containerID="fb61cd9da7675e5c65d35b450b1b094bbd07e1990fd03bd6f6429e6713facb61" exitCode=0 Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.701177 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" event={"ID":"f4c4b117-c28a-484b-854a-0c2145d7d881","Type":"ContainerDied","Data":"fb61cd9da7675e5c65d35b450b1b094bbd07e1990fd03bd6f6429e6713facb61"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.701234 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" event={"ID":"f4c4b117-c28a-484b-854a-0c2145d7d881","Type":"ContainerStarted","Data":"8cf5c8a36ab5d6d0319acf23d08215ff1632bc755afe6f3bbe6da8e98c003d65"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.702966 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5w5k" podStartSLOduration=70.702936679 podStartE2EDuration="1m10.702936679s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:14.640881548 +0000 UTC m=+95.239476032" watchObservedRunningTime="2025-12-03 18:56:14.702936679 +0000 UTC m=+95.301531153" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.721460 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.727643 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.227619726 +0000 UTC m=+95.826214190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.769910 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" event={"ID":"11140349-9793-456a-8ac2-5bcd0e917ea1","Type":"ContainerStarted","Data":"f71cf364155a9073f63ef7b96a503cef1ed8778389c932c9238f40e9474cc996"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.828021 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-q4mkw" event={"ID":"acf996a8-0dfa-4d25-9112-5a3c4688cb77","Type":"ContainerStarted","Data":"f90a99145b4c6bd190230dd2ec171ca8ab1f42c47984a208f343065cabe3c27b"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.830285 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.830990 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.330970832 +0000 UTC m=+95.929565296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.832318 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" podStartSLOduration=70.832300169 podStartE2EDuration="1m10.832300169s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:14.831768281 +0000 UTC m=+95.430362745" watchObservedRunningTime="2025-12-03 18:56:14.832300169 +0000 UTC m=+95.430894633" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.861124 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2mkdc" event={"ID":"ae92a1b7-7488-465a-bf52-0cdc4de799f3","Type":"ContainerStarted","Data":"d6ee464b3808032eef7fe3121df8a783b59bc5b409fcc6a3cf3300159c8a5456"} Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.861164 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2mkdc" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.888455 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-rnmc9" podStartSLOduration=5.888436488 podStartE2EDuration="5.888436488s" podCreationTimestamp="2025-12-03 18:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 18:56:14.887614809 +0000 UTC m=+95.486209283" watchObservedRunningTime="2025-12-03 18:56:14.888436488 +0000 UTC m=+95.487030952" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.900310 4731 patch_prober.go:28] interesting pod/downloads-7954f5f757-2mkdc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.900376 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2mkdc" podUID="ae92a1b7-7488-465a-bf52-0cdc4de799f3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 03 18:56:14 crc kubenswrapper[4731]: I1203 18:56:14.933675 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:14 crc kubenswrapper[4731]: E1203 18:56:14.941568 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.441548677 +0000 UTC m=+96.040143141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.043913 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vvkjw" podStartSLOduration=71.043894037 podStartE2EDuration="1m11.043894037s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:14.984590065 +0000 UTC m=+95.583184529" watchObservedRunningTime="2025-12-03 18:56:15.043894037 +0000 UTC m=+95.642488501" Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.045158 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.045709 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.545693852 +0000 UTC m=+96.144288316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.098185 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xplb6" podStartSLOduration=71.098142387 podStartE2EDuration="1m11.098142387s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:15.097780045 +0000 UTC m=+95.696374519" watchObservedRunningTime="2025-12-03 18:56:15.098142387 +0000 UTC m=+95.696736851" Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.151996 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.152481 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.65246258 +0000 UTC m=+96.251057044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.194984 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-q4mkw" podStartSLOduration=71.194964508 podStartE2EDuration="1m11.194964508s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:15.194873085 +0000 UTC m=+95.793467549" watchObservedRunningTime="2025-12-03 18:56:15.194964508 +0000 UTC m=+95.793558972" Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.222863 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2mkdc" podStartSLOduration=71.222839471 podStartE2EDuration="1m11.222839471s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:15.221310195 +0000 UTC m=+95.819904659" watchObservedRunningTime="2025-12-03 18:56:15.222839471 +0000 UTC m=+95.821433935" Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.246978 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tj2gg" podStartSLOduration=71.246954237 podStartE2EDuration="1m11.246954237s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:15.244276881 +0000 UTC m=+95.842871335" watchObservedRunningTime="2025-12-03 18:56:15.246954237 +0000 UTC m=+95.845548701" Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.253995 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.254360 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.754347344 +0000 UTC m=+96.352941808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.355153 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.355543 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.855527572 +0000 UTC m=+96.454122036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.458530 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.458922 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:15.958909808 +0000 UTC m=+96.557504272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.550962 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.553462 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.553631 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.559375 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.559594 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.059567177 +0000 UTC m=+96.658161641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.559695 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.560071 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.060053584 +0000 UTC m=+96.658648048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.660678 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.661143 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.161129277 +0000 UTC m=+96.759723741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.763316 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.763714 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.263696506 +0000 UTC m=+96.862290970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.864944 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.865804 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.365785725 +0000 UTC m=+96.964380189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.966528 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:15 crc kubenswrapper[4731]: E1203 18:56:15.968353 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.468338553 +0000 UTC m=+97.066933017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.984120 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" event={"ID":"b958e8b3-a0cb-4bb0-b051-befcfc7fad43","Type":"ContainerStarted","Data":"4c12296930156b33fa010f89344e23ee1dbd3c91198195f7f9ba9788976a220d"} Dec 03 18:56:15 crc kubenswrapper[4731]: I1203 18:56:15.984166 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" event={"ID":"b958e8b3-a0cb-4bb0-b051-befcfc7fad43","Type":"ContainerStarted","Data":"859a7eccebb374f5896c2fd1d39187cfabcf82fe3ac1911d92dd1509ac5c0c98"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.036192 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" podStartSLOduration=71.036174771 podStartE2EDuration="1m11.036174771s" podCreationTimestamp="2025-12-03 18:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:15.280484303 +0000 UTC m=+95.879078767" watchObservedRunningTime="2025-12-03 18:56:16.036174771 +0000 UTC m=+96.634769225" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.065459 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65n9c" 
event={"ID":"3033d290-d147-4727-8d61-0dabed08e76d","Type":"ContainerStarted","Data":"17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.074304 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.075431 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.567888702 +0000 UTC m=+97.166483166 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.075712 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.076171 4731 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.576160229 +0000 UTC m=+97.174754703 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.120790 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9cslg" podStartSLOduration=72.120773653 podStartE2EDuration="1m12.120773653s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.037752598 +0000 UTC m=+96.636347062" watchObservedRunningTime="2025-12-03 18:56:16.120773653 +0000 UTC m=+96.719368107" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.121411 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-65n9c" podStartSLOduration=72.121407516 podStartE2EDuration="1m12.121407516s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.115879627 +0000 UTC m=+96.714474101" watchObservedRunningTime="2025-12-03 18:56:16.121407516 +0000 UTC m=+96.720001980" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.164244 4731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" event={"ID":"f2c36b28-04ab-4880-a79e-11d47a6fe1e6","Type":"ContainerStarted","Data":"68ef6c2d5045615da93c0dccabfd639f0baf6357ff63da34bd9ce18233736f2d"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.164321 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" event={"ID":"f2c36b28-04ab-4880-a79e-11d47a6fe1e6","Type":"ContainerStarted","Data":"e104a00c9c5cf440d13f7abf64a52deb300ddb241f49e7694cc1464626a2c83c"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.183177 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" event={"ID":"88affbdd-aaa2-449a-a1f7-f3bb203f176f","Type":"ContainerStarted","Data":"0c2b9f9b47123771aecf417a556c61f56154f4621a260a43035e4b7cc11cee7d"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.184840 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.185635 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.685601184 +0000 UTC m=+97.284195648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.185925 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.187712 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.687697729 +0000 UTC m=+97.286292193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.196515 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" event={"ID":"65e30694-94d5-4131-87ef-dfd2e1acec56","Type":"ContainerStarted","Data":"c3eda4bb36e38ac697c478724ea86aaeee707390c66d11834e8f14763d8bd4eb"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.216909 4731 generic.go:334] "Generic (PLEG): container finished" podID="4893d018-3556-4292-9c3f-b7741732e8eb" containerID="93053afd1e972c7f1b000e68a07fce5ad65bd6fc296dd7fd69b7a088c06e62c6" exitCode=0 Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.216979 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" event={"ID":"4893d018-3556-4292-9c3f-b7741732e8eb","Type":"ContainerDied","Data":"93053afd1e972c7f1b000e68a07fce5ad65bd6fc296dd7fd69b7a088c06e62c6"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.230294 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" event={"ID":"3fcf8af5-8fdc-458e-ae09-1df6a2eaa1ad","Type":"ContainerStarted","Data":"12f9137eb80a0938e2a4eef6d2dd2e030a057f35e835ed8b3088dc50c293edec"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.250083 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" 
event={"ID":"53338979-623e-4f88-8f10-41d65be09af5","Type":"ContainerStarted","Data":"52c652e67ac4b1c62185f8175cc9c8a84c0728f505c0c9d317a819ae0d2696a3"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.267165 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b654x" podStartSLOduration=71.267149206 podStartE2EDuration="1m11.267149206s" podCreationTimestamp="2025-12-03 18:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.219191891 +0000 UTC m=+96.817786355" watchObservedRunningTime="2025-12-03 18:56:16.267149206 +0000 UTC m=+96.865743670" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.272045 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" event={"ID":"d856a441-2665-48cd-aed6-d48c9ff0f3c4","Type":"ContainerStarted","Data":"05b846112331eecc2abcfef9ca3e5d1c21b4206a09f01493ed4c2b4c49194852"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.272110 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" event={"ID":"d856a441-2665-48cd-aed6-d48c9ff0f3c4","Type":"ContainerStarted","Data":"601836420fae73e3e6b629ff21b5df1aa0ffa056f51553a99c306c7789bc46d0"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.273551 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.283045 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" event={"ID":"7c1d214a-8055-4d3b-9131-c2c2510b2939","Type":"ContainerStarted","Data":"0a6f51c8f9493d8583562d3f32755a22d561f6cae57f0b4b41b653ec9ad44341"} 
Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.287674 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.288366 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.788341157 +0000 UTC m=+97.386935621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.288713 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.290803 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 18:56:16.790792655 +0000 UTC m=+97.389387109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.310060 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xrx95" podStartSLOduration=72.310038318 podStartE2EDuration="1m12.310038318s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.266609196 +0000 UTC m=+96.865203660" watchObservedRunningTime="2025-12-03 18:56:16.310038318 +0000 UTC m=+96.908632782" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.319930 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" event={"ID":"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e","Type":"ContainerStarted","Data":"f7a578c4246294e4ae935d628c0bf14c5e93ada351294334a116923b1a1288f1"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.340581 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xwlrx" event={"ID":"f78ccef1-3b10-4bee-b8b7-19beedacb04b","Type":"ContainerStarted","Data":"e5104ad29d99e2ddfa58f12a4acdca2a53949657923ff48b88cec596b7943134"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.345450 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fx564" podStartSLOduration=72.34543258 podStartE2EDuration="1m12.34543258s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.344729905 +0000 UTC m=+96.943324369" watchObservedRunningTime="2025-12-03 18:56:16.34543258 +0000 UTC m=+96.944027054" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.359438 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" event={"ID":"f0b43134-d5ba-4f4b-bd4b-e5a838d23b18","Type":"ContainerStarted","Data":"404a786fdbcda9c12ba2e302f50ba9d1ec514fa6444638d66869046c922c1645"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.363073 4731 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-z4xtb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.363117 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" podUID="d856a441-2665-48cd-aed6-d48c9ff0f3c4" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.377582 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-q4mkw" event={"ID":"acf996a8-0dfa-4d25-9112-5a3c4688cb77","Type":"ContainerStarted","Data":"9282c72f45d0853696d6a911586211677ef13aa13299843ca829aae06ddafb44"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.392113 4731 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.392138 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-drbj7" podStartSLOduration=72.392123828 podStartE2EDuration="1m12.392123828s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.391713223 +0000 UTC m=+96.990307697" watchObservedRunningTime="2025-12-03 18:56:16.392123828 +0000 UTC m=+96.990718292" Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.401709 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.901682962 +0000 UTC m=+97.500277426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.402098 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.403642 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:16.903634542 +0000 UTC m=+97.502229006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.416109 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" event={"ID":"7460fcca-b017-4d16-9d8a-8d3c4cc910e9","Type":"ContainerStarted","Data":"5dff1676d08785214308d3cba6cf81f57fdc2445340a70b2e52139acd8b89c64"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.417972 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" event={"ID":"febe3747-e9a5-4cde-84ee-2b7708794897","Type":"ContainerStarted","Data":"ff6461b140834b492e029059ea1059adb4410fc6ca1ff76ef0bce9e937d4f56f"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.418002 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" event={"ID":"febe3747-e9a5-4cde-84ee-2b7708794897","Type":"ContainerStarted","Data":"a8fb9cd103701e92495e9bb57683a945d8198be6dcb748487e784583bcfe4a67"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.418455 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.424493 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" 
event={"ID":"ef7bee06-2bfb-4426-82a3-410ef3205bee","Type":"ContainerStarted","Data":"ca85c3c5d3e8bb64644354b925426b3c7602bfd0f3bdd2a203c17ca1b0d0536d"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.424544 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" event={"ID":"ef7bee06-2bfb-4426-82a3-410ef3205bee","Type":"ContainerStarted","Data":"175fc3491d6edfc6220384d94a2729f1a40026891001aa2d5527020e7679017a"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.425598 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.459190 4731 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-knsx5 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.459275 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" podUID="ef7bee06-2bfb-4426-82a3-410ef3205bee" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.462191 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xwlrx" podStartSLOduration=7.4621787170000005 podStartE2EDuration="7.462178717s" podCreationTimestamp="2025-12-03 18:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.460566979 +0000 UTC m=+97.059161443" watchObservedRunningTime="2025-12-03 
18:56:16.462178717 +0000 UTC m=+97.060773191" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.462566 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kk955" podStartSLOduration=72.462561951 podStartE2EDuration="1m12.462561951s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.421438493 +0000 UTC m=+97.020032967" watchObservedRunningTime="2025-12-03 18:56:16.462561951 +0000 UTC m=+97.061156415" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.483961 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" event={"ID":"8788b686-0b60-4ad3-9e34-16f6fb03c2d0","Type":"ContainerStarted","Data":"3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.485479 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.499405 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d9ltn" podStartSLOduration=72.499391215 podStartE2EDuration="1m12.499391215s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.499002101 +0000 UTC m=+97.097596575" watchObservedRunningTime="2025-12-03 18:56:16.499391215 +0000 UTC m=+97.097985679" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.503794 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.505081 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.005064959 +0000 UTC m=+97.603659423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.518985 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.519457 4731 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-srnnr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.519517 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Dec 
03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.549545 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k" event={"ID":"3fbd46a2-b020-4bae-9c09-579b17677c8f","Type":"ContainerStarted","Data":"bddc7b1965bc825efa575f35c34ec1ed8582e045aa6ec299dc9e743f3dfe3066"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.549613 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k" event={"ID":"3fbd46a2-b020-4bae-9c09-579b17677c8f","Type":"ContainerStarted","Data":"f2d65ca85bb74fdb770601d6b62a785d477c3e8d3d66b805eda38d9f7ffcd786"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.574478 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" podStartSLOduration=72.574448343 podStartE2EDuration="1m12.574448343s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.563628975 +0000 UTC m=+97.162223439" watchObservedRunningTime="2025-12-03 18:56:16.574448343 +0000 UTC m=+97.173042807" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.583371 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 18:56:16 crc kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:16 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:16 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.583440 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" 
podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.605192 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.608480 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.108466487 +0000 UTC m=+97.707060951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.613128 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqx27" event={"ID":"0f05323b-fadc-4481-a4b9-112753505b1e","Type":"ContainerStarted","Data":"96628e1c5eb0edd2a347c22bb502cefaeb71d3bff5bf920999db71a5a570d97f"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.620562 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k7gjd" podStartSLOduration=72.620538471 
podStartE2EDuration="1m12.620538471s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.61800552 +0000 UTC m=+97.216599984" watchObservedRunningTime="2025-12-03 18:56:16.620538471 +0000 UTC m=+97.219132935" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.673799 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" event={"ID":"858f1146-e313-4cc2-a195-01fa1a85e62e","Type":"ContainerStarted","Data":"5c22fe87291828b4e7173882d6d88b6de73698a561f4a373775ca2b7c7824631"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.673845 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" event={"ID":"858f1146-e313-4cc2-a195-01fa1a85e62e","Type":"ContainerStarted","Data":"b22eb2d10420e855916eeb02aa41347461fbb30ae75429520ffddd77f294e240"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.709056 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.710308 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.210290088 +0000 UTC m=+97.808884562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.712903 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" event={"ID":"ca25a279-e208-48df-b048-c4c7fa91c9a2","Type":"ContainerStarted","Data":"9582c2e16804d05314149dbb980ff617b69062d4749d955029d3752a7ca56e42"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.810917 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.813288 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.31327225 +0000 UTC m=+97.911866714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.823537 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" podStartSLOduration=72.823521188 podStartE2EDuration="1m12.823521188s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.711439138 +0000 UTC m=+97.310033602" watchObservedRunningTime="2025-12-03 18:56:16.823521188 +0000 UTC m=+97.422115652" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.824172 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" podStartSLOduration=72.824166341 podStartE2EDuration="1m12.824166341s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.822005334 +0000 UTC m=+97.420599798" watchObservedRunningTime="2025-12-03 18:56:16.824166341 +0000 UTC m=+97.422760815" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.858917 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" event={"ID":"47f3df42-1762-4b23-a3b2-af671d9126df","Type":"ContainerStarted","Data":"a4b37defcb5e851f22df3ae065ced8edc18733007f682592d37ce6a591fb4ac8"} Dec 03 
18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.882930 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" podStartSLOduration=72.882906703 podStartE2EDuration="1m12.882906703s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.879915006 +0000 UTC m=+97.478509470" watchObservedRunningTime="2025-12-03 18:56:16.882906703 +0000 UTC m=+97.481501167" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.895771 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" event={"ID":"7b586f69-0fbe-4e39-b741-db5e4fd70d21","Type":"ContainerStarted","Data":"ca206a4fe3a37b8a34bb7e4aff0094cb258c793ff9a43ca21f901e813681fc70"} Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.915601 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:16 crc kubenswrapper[4731]: E1203 18:56:16.927227 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.427201906 +0000 UTC m=+98.025796370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.972175 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgf5k" podStartSLOduration=72.972157242 podStartE2EDuration="1m12.972157242s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.928377548 +0000 UTC m=+97.526972012" watchObservedRunningTime="2025-12-03 18:56:16.972157242 +0000 UTC m=+97.570751696" Dec 03 18:56:16 crc kubenswrapper[4731]: I1203 18:56:16.988634 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" event={"ID":"d7b3325d-acb8-4fe7-a7cc-30df55f51b09","Type":"ContainerStarted","Data":"b8d19e6bcaf807408c2774711e3f41e86d9e336eab9f605ac29f4f48d5367a1d"} Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.017552 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.018652 4731 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.518634892 +0000 UTC m=+98.117229346 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.019163 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" event={"ID":"294f0919-18b7-42f8-8528-c3ada8d12d53","Type":"ContainerStarted","Data":"fabaadcaeacad1a67a5f48c25afe1cd2186bd13b04f50e08841e788b0948f72d"} Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.024756 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-bbd9p" podStartSLOduration=72.024736762 podStartE2EDuration="1m12.024736762s" podCreationTimestamp="2025-12-03 18:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:16.972493814 +0000 UTC m=+97.571088278" watchObservedRunningTime="2025-12-03 18:56:17.024736762 +0000 UTC m=+97.623331226" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.025622 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" podStartSLOduration=73.025616894 podStartE2EDuration="1m13.025616894s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:17.024244364 +0000 UTC m=+97.622838838" watchObservedRunningTime="2025-12-03 18:56:17.025616894 +0000 UTC m=+97.624211358" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.050868 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" podStartSLOduration=73.050853501 podStartE2EDuration="1m13.050853501s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:17.047925716 +0000 UTC m=+97.646520180" watchObservedRunningTime="2025-12-03 18:56:17.050853501 +0000 UTC m=+97.649447965" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.082618 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" event={"ID":"11140349-9793-456a-8ac2-5bcd0e917ea1","Type":"ContainerStarted","Data":"c7a36330d4f2af28d787df8421a3503944c7bc1e6f91ad8fb87e93637203e9bd"} Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.082683 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" event={"ID":"11140349-9793-456a-8ac2-5bcd0e917ea1","Type":"ContainerStarted","Data":"a7ba5a57dc9d18e9a4102cb7f465843b63e7c4ec96e1a856a8280c08fc0204d5"} Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.083773 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" podStartSLOduration=73.083755434 podStartE2EDuration="1m13.083755434s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 18:56:17.082616904 +0000 UTC m=+97.681211368" watchObservedRunningTime="2025-12-03 18:56:17.083755434 +0000 UTC m=+97.682349898" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.109640 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zf9fq" event={"ID":"d04233c6-8cf6-4450-aa74-cab1e1d12f4a","Type":"ContainerStarted","Data":"acf61a8fe004aa896b2622d2c3d3222934aa9392b23143b262a99232c9447e27"} Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.111187 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" podStartSLOduration=73.111166759 podStartE2EDuration="1m13.111166759s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:17.110022349 +0000 UTC m=+97.708616823" watchObservedRunningTime="2025-12-03 18:56:17.111166759 +0000 UTC m=+97.709761223" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.119116 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.119709 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.619696246 +0000 UTC m=+98.218290710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.161991 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rnmc9" event={"ID":"149fd6d7-3127-457a-8d8b-03de32877b50","Type":"ContainerStarted","Data":"356d32015f3cdf75e6c5411d623841344c4f58b97640fc01f22e543c944f7c38"} Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.165437 4731 patch_prober.go:28] interesting pod/downloads-7954f5f757-2mkdc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.165624 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2mkdc" podUID="ae92a1b7-7488-465a-bf52-0cdc4de799f3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.174522 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" podStartSLOduration=73.174506587 podStartE2EDuration="1m13.174506587s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:17.173073875 +0000 UTC m=+97.771668339" 
watchObservedRunningTime="2025-12-03 18:56:17.174506587 +0000 UTC m=+97.773101051" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.175597 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" podStartSLOduration=73.175591125 podStartE2EDuration="1m13.175591125s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:17.136077886 +0000 UTC m=+97.734672350" watchObservedRunningTime="2025-12-03 18:56:17.175591125 +0000 UTC m=+97.774185589" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.189832 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.220038 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2pc2" podStartSLOduration=73.220018823 podStartE2EDuration="1m13.220018823s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:17.217107849 +0000 UTC m=+97.815702313" watchObservedRunningTime="2025-12-03 18:56:17.220018823 +0000 UTC m=+97.818613277" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.221030 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.221878 
4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.721864669 +0000 UTC m=+98.320459133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.325781 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.327084 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.827064851 +0000 UTC m=+98.425659315 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.428875 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.429398 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:17.92938175 +0000 UTC m=+98.527976214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.529567 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.529978 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:18.029938175 +0000 UTC m=+98.628532639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.563548 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 18:56:17 crc kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:17 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:17 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.563607 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.631903 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.632231 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 18:56:18.132219152 +0000 UTC m=+98.730813616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.734192 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.734530 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:18.23451413 +0000 UTC m=+98.833108594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.835985 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.837036 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:18.337013635 +0000 UTC m=+98.935608099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.860775 4731 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdxt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.861120 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" podUID="47f3df42-1762-4b23-a3b2-af671d9126df" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.18:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.937955 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:17 crc kubenswrapper[4731]: E1203 18:56:17.938395 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 18:56:18.438360089 +0000 UTC m=+99.036954553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.972885 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.973943 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.987603 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.991536 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 18:56:17 crc kubenswrapper[4731]: I1203 18:56:17.994681 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.039440 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.039517 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.039625 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.040077 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:18.540061055 +0000 UTC m=+99.138655519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.140626 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.140872 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.141032 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.141129 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.141241 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:18.641216522 +0000 UTC m=+99.239810986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.177186 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.201245 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v7ln2" event={"ID":"858f1146-e313-4cc2-a195-01fa1a85e62e","Type":"ContainerStarted","Data":"69bf2261a0a329ae3e940a972f65b033546b01d08bb553d77b33f80e79f6cbfb"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.222043 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" event={"ID":"8c6e4ce0-c1e7-4aa9-8b20-3949db22818e","Type":"ContainerStarted","Data":"068d9920d36668f6475345353bb7832f096b3be6d32bb2d6241ad0493dd1dd51"} Dec 03 18:56:18 crc 
kubenswrapper[4731]: I1203 18:56:18.235067 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" event={"ID":"4893d018-3556-4292-9c3f-b7741732e8eb","Type":"ContainerStarted","Data":"895c601617b648324ef339c231ae29d49e6fd4eac6a82e647b3e2efa472fc47f"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.242380 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.242821 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:18.742800704 +0000 UTC m=+99.341395168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.248083 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" event={"ID":"7b586f69-0fbe-4e39-b741-db5e4fd70d21","Type":"ContainerStarted","Data":"1929bc4ebe6d9d40607e403c1ab9f1c6f885d1a4550bec9673faa9fe04fe87fb"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.248121 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lb29g" event={"ID":"7b586f69-0fbe-4e39-b741-db5e4fd70d21","Type":"ContainerStarted","Data":"3cde5204e17bbb54a3fb06d94e23c44d5e81ce21bf12831061124a61f153d79a"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.250055 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqx27" event={"ID":"0f05323b-fadc-4481-a4b9-112753505b1e","Type":"ContainerStarted","Data":"6d38daab81805ca31a5e54ecb62ab7644d9a79f42ba74727ac91e29eb85f3333"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.251116 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kh5lp" event={"ID":"d7b3325d-acb8-4fe7-a7cc-30df55f51b09","Type":"ContainerStarted","Data":"e57d16365529ccf0fe517def0f89abc4f6da98bc8ee56e341f4920a3555be4ef"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.268477 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zf9fq" 
event={"ID":"d04233c6-8cf6-4450-aa74-cab1e1d12f4a","Type":"ContainerStarted","Data":"0884864a5d37fc88b5c6723bec335da6ee443e308632e6d68454b688f03a7aa7"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.268753 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zf9fq" event={"ID":"d04233c6-8cf6-4450-aa74-cab1e1d12f4a","Type":"ContainerStarted","Data":"8d2654f135894677ed9147b2206e648147145c5556415089b4237fb38138095d"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.269443 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.293748 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" podStartSLOduration=74.293711814 podStartE2EDuration="1m14.293711814s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:18.287323184 +0000 UTC m=+98.885917658" watchObservedRunningTime="2025-12-03 18:56:18.293711814 +0000 UTC m=+98.892306288" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.294466 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.307003 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mbm7g" event={"ID":"ca25a279-e208-48df-b048-c4c7fa91c9a2","Type":"ContainerStarted","Data":"a402974c9e2c05a00f569765cb617cf77491b28193c9ebc46e240118e05e1455"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.311845 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" event={"ID":"f4c4b117-c28a-484b-854a-0c2145d7d881","Type":"ContainerStarted","Data":"2ed2118297f250416cff0f33c2102f300ebade375776b186a5c7a255118fb285"} Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.314691 4731 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-srnnr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.314735 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.327912 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" podStartSLOduration=74.327893693 podStartE2EDuration="1m14.327893693s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:18.3266992 +0000 UTC 
m=+98.925293664" watchObservedRunningTime="2025-12-03 18:56:18.327893693 +0000 UTC m=+98.926488157" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.333696 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knsx5" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.334636 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z4xtb" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.343716 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.344360 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:18.844341954 +0000 UTC m=+99.442936418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.345455 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.352693 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:18.852671134 +0000 UTC m=+99.451265778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.399368 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zf9fq" podStartSLOduration=9.399346652 podStartE2EDuration="9.399346652s" podCreationTimestamp="2025-12-03 18:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:18.375742093 +0000 UTC m=+98.974336557" watchObservedRunningTime="2025-12-03 18:56:18.399346652 +0000 UTC m=+98.997941116" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.410727 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdxt" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.446742 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.455053 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 18:56:18.955012552 +0000 UTC m=+99.553607016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.553186 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.553786 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:19.053766673 +0000 UTC m=+99.652361137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.560842 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 18:56:18 crc kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:18 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:18 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.560935 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.664865 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.665524 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 18:56:19.165508311 +0000 UTC m=+99.764102775 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.728500 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hl427" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.770961 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.771339 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:19.271326665 +0000 UTC m=+99.869921129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.853180 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zkdtc"] Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.859969 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.873619 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.874387 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:19.37436683 +0000 UTC m=+99.972961304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.879542 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.908809 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkdtc"] Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.976904 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-catalog-content\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.976944 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-utilities\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.977065 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: 
\"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:18 crc kubenswrapper[4731]: I1203 18:56:18.977102 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l56l7\" (UniqueName: \"kubernetes.io/projected/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-kube-api-access-l56l7\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:18 crc kubenswrapper[4731]: E1203 18:56:18.977485 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:19.477470086 +0000 UTC m=+100.076064550 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.042318 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jz9nw"] Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.043718 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.047647 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.068588 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jz9nw"] Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.079831 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.080036 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-catalog-content\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.080100 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-utilities\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.080182 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l56l7\" (UniqueName: \"kubernetes.io/projected/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-kube-api-access-l56l7\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " 
pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.080623 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:19.580608584 +0000 UTC m=+100.179203048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.081023 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-catalog-content\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.081264 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-utilities\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.167006 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.179086 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l56l7\" (UniqueName: \"kubernetes.io/projected/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-kube-api-access-l56l7\") pod \"certified-operators-zkdtc\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.181776 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.181870 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-catalog-content\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.181931 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmq9j\" (UniqueName: \"kubernetes.io/projected/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-kube-api-access-dmq9j\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.181996 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-utilities\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc 
kubenswrapper[4731]: E1203 18:56:19.182519 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:19.682485667 +0000 UTC m=+100.281080131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.290401 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.291038 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.291238 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-catalog-content\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.291290 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmq9j\" (UniqueName: 
\"kubernetes.io/projected/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-kube-api-access-dmq9j\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.291326 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-utilities\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.291447 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:19.791424784 +0000 UTC m=+100.390019338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.291686 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-utilities\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.291857 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-catalog-content\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.294879 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cl7ct"] Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.295810 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.377339 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cl7ct"] Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.391101 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqx27" event={"ID":"0f05323b-fadc-4481-a4b9-112753505b1e","Type":"ContainerStarted","Data":"5f555d7be3078225954497b722b4f296fb5a209ff62f3d36b1d49114f0328890"} Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.391159 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmq9j\" (UniqueName: \"kubernetes.io/projected/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-kube-api-access-dmq9j\") pod \"community-operators-jz9nw\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.406012 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlzjk\" (UniqueName: \"kubernetes.io/projected/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-kube-api-access-jlzjk\") pod \"certified-operators-cl7ct\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.406067 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.406133 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-utilities\") pod \"certified-operators-cl7ct\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.406154 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-catalog-content\") pod \"certified-operators-cl7ct\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.406527 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:19.906514382 +0000 UTC m=+100.505108846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.428845 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cf59d42-b5d1-49f8-87a8-8230f17d36b8","Type":"ContainerStarted","Data":"cd4ec1801810bc138156204bd2699b5b6f45ee1fa7cc948dae2864c94ee2fdf5"} Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.466268 4731 generic.go:334] "Generic (PLEG): container finished" podID="294f0919-18b7-42f8-8528-c3ada8d12d53" containerID="fabaadcaeacad1a67a5f48c25afe1cd2186bd13b04f50e08841e788b0948f72d" exitCode=0 Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.467439 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" event={"ID":"294f0919-18b7-42f8-8528-c3ada8d12d53","Type":"ContainerDied","Data":"fabaadcaeacad1a67a5f48c25afe1cd2186bd13b04f50e08841e788b0948f72d"} Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.476556 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.479171 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r88d7"] Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.480272 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.507112 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.507110 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r88d7"] Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.507178 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:20.00716389 +0000 UTC m=+100.605758354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.507951 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlzjk\" (UniqueName: \"kubernetes.io/projected/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-kube-api-access-jlzjk\") pod \"certified-operators-cl7ct\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.508038 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.508361 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-utilities\") pod \"certified-operators-cl7ct\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.508394 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-catalog-content\") pod \"certified-operators-cl7ct\" (UID: 
\"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.508497 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh9zs\" (UniqueName: \"kubernetes.io/projected/7420acab-728e-4fa3-b4e0-49db517c4018-kube-api-access-xh9zs\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.508556 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-utilities\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.508686 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-catalog-content\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.511074 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-utilities\") pod \"certified-operators-cl7ct\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.514040 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 18:56:20.014023546 +0000 UTC m=+100.612618230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.517277 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-catalog-content\") pod \"certified-operators-cl7ct\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.564521 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 18:56:19 crc kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:19 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:19 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.564615 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.580347 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlzjk\" (UniqueName: 
\"kubernetes.io/projected/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-kube-api-access-jlzjk\") pod \"certified-operators-cl7ct\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.611528 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.612205 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh9zs\" (UniqueName: \"kubernetes.io/projected/7420acab-728e-4fa3-b4e0-49db517c4018-kube-api-access-xh9zs\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.612279 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-utilities\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.612329 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-catalog-content\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.612935 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-catalog-content\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.613034 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:20.113013725 +0000 UTC m=+100.711608189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.613732 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-utilities\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.614879 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.670507 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.691865 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh9zs\" (UniqueName: \"kubernetes.io/projected/7420acab-728e-4fa3-b4e0-49db517c4018-kube-api-access-xh9zs\") pod \"community-operators-r88d7\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.717069 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.717520 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:20.217504421 +0000 UTC m=+100.816098885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.731859 4731 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.807558 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.819944 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.820432 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:20.320412422 +0000 UTC m=+100.919006886 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:19 crc kubenswrapper[4731]: I1203 18:56:19.925662 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:19 crc kubenswrapper[4731]: E1203 18:56:19.926029 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 18:56:20.426016358 +0000 UTC m=+101.024610822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlh5v" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.030382 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:20 crc kubenswrapper[4731]: E1203 18:56:20.030836 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 18:56:20.530812976 +0000 UTC m=+101.129407440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.031850 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkdtc"] Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.059964 4731 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T18:56:19.731890709Z","Handler":null,"Name":""} Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.083314 4731 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.083349 4731 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.134650 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.137203 4731 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.137237 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.185943 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlh5v\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.236097 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.239944 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cl7ct"] Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.266584 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: 
"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.343043 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jz9nw"] Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.349006 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r88d7"] Dec 03 18:56:20 crc kubenswrapper[4731]: W1203 18:56:20.373922 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7420acab_728e_4fa3_b4e0_49db517c4018.slice/crio-01967f255f0916532ee66abad481a685e00488ca94ac2086d83d2ef7a04e763b WatchSource:0}: Error finding container 01967f255f0916532ee66abad481a685e00488ca94ac2086d83d2ef7a04e763b: Status 404 returned error can't find the container with id 01967f255f0916532ee66abad481a685e00488ca94ac2086d83d2ef7a04e763b Dec 03 18:56:20 crc kubenswrapper[4731]: W1203 18:56:20.376454 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod169fb2cd_829d_4f3e_8a08_33c431d6c3d1.slice/crio-7c72073ca8b48ecf40d98be3dad1df3839a2a7d8a528b24c479bd8b1e2ff0d76 WatchSource:0}: Error finding container 7c72073ca8b48ecf40d98be3dad1df3839a2a7d8a528b24c479bd8b1e2ff0d76: Status 404 returned error can't find the container with id 7c72073ca8b48ecf40d98be3dad1df3839a2a7d8a528b24c479bd8b1e2ff0d76 Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.477991 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.478441 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jz9nw" event={"ID":"169fb2cd-829d-4f3e-8a08-33c431d6c3d1","Type":"ContainerStarted","Data":"7c72073ca8b48ecf40d98be3dad1df3839a2a7d8a528b24c479bd8b1e2ff0d76"} Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.488747 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cf59d42-b5d1-49f8-87a8-8230f17d36b8","Type":"ContainerStarted","Data":"8171a7eb72234817362664908a2851160acf85dd0bcf60aadeb0fa71e45d59f0"} Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.494206 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r88d7" event={"ID":"7420acab-728e-4fa3-b4e0-49db517c4018","Type":"ContainerStarted","Data":"01967f255f0916532ee66abad481a685e00488ca94ac2086d83d2ef7a04e763b"} Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.496782 4731 generic.go:334] "Generic (PLEG): container finished" podID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerID="3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6" exitCode=0 Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.497558 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkdtc" event={"ID":"7e4faf4b-b94a-4903-b53c-9b4fa33b8052","Type":"ContainerDied","Data":"3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6"} Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.497585 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkdtc" event={"ID":"7e4faf4b-b94a-4903-b53c-9b4fa33b8052","Type":"ContainerStarted","Data":"b498ecb9c3f87d2efc1ab1a9bb50528be384f4356bc8a8064287ae1155573fdf"} Dec 03 18:56:20 crc 
kubenswrapper[4731]: I1203 18:56:20.499020 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.505987 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.505972479 podStartE2EDuration="3.505972479s" podCreationTimestamp="2025-12-03 18:56:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:20.502749663 +0000 UTC m=+101.101344137" watchObservedRunningTime="2025-12-03 18:56:20.505972479 +0000 UTC m=+101.104566943" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.511619 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqx27" event={"ID":"0f05323b-fadc-4481-a4b9-112753505b1e","Type":"ContainerStarted","Data":"56c7b6a2ce5f7e34cdfd26c85c08ba7272478f6c2bc58288adb56f7bc111e28c"} Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.511660 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jqx27" event={"ID":"0f05323b-fadc-4481-a4b9-112753505b1e","Type":"ContainerStarted","Data":"bf68ed5c019b1c6692fa7ffca1c91adeebc1dd0ed07888011caced557987f2f9"} Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.513766 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7ct" event={"ID":"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3","Type":"ContainerStarted","Data":"b030958786cc6820a51d47e972b51fd1d665ca07e1f7d2920cb669245f7b95c9"} Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.577888 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 03 18:56:20 crc kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:20 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:20 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.577965 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.579547 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jqx27" podStartSLOduration=11.579527614 podStartE2EDuration="11.579527614s" podCreationTimestamp="2025-12-03 18:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:20.578649852 +0000 UTC m=+101.177244326" watchObservedRunningTime="2025-12-03 18:56:20.579527614 +0000 UTC m=+101.178122068" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.875886 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.960855 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294f0919-18b7-42f8-8528-c3ada8d12d53-config-volume\") pod \"294f0919-18b7-42f8-8528-c3ada8d12d53\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.960931 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294f0919-18b7-42f8-8528-c3ada8d12d53-secret-volume\") pod \"294f0919-18b7-42f8-8528-c3ada8d12d53\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.961004 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz2w6\" (UniqueName: \"kubernetes.io/projected/294f0919-18b7-42f8-8528-c3ada8d12d53-kube-api-access-bz2w6\") pod \"294f0919-18b7-42f8-8528-c3ada8d12d53\" (UID: \"294f0919-18b7-42f8-8528-c3ada8d12d53\") " Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.962487 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/294f0919-18b7-42f8-8528-c3ada8d12d53-config-volume" (OuterVolumeSpecName: "config-volume") pod "294f0919-18b7-42f8-8528-c3ada8d12d53" (UID: "294f0919-18b7-42f8-8528-c3ada8d12d53"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.969094 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294f0919-18b7-42f8-8528-c3ada8d12d53-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "294f0919-18b7-42f8-8528-c3ada8d12d53" (UID: "294f0919-18b7-42f8-8528-c3ada8d12d53"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.969566 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294f0919-18b7-42f8-8528-c3ada8d12d53-kube-api-access-bz2w6" (OuterVolumeSpecName: "kube-api-access-bz2w6") pod "294f0919-18b7-42f8-8528-c3ada8d12d53" (UID: "294f0919-18b7-42f8-8528-c3ada8d12d53"). InnerVolumeSpecName "kube-api-access-bz2w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:56:20 crc kubenswrapper[4731]: I1203 18:56:20.989611 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlh5v"] Dec 03 18:56:20 crc kubenswrapper[4731]: W1203 18:56:20.999781 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63540dce_b2ef_48ab_9aad_dc0afcbec369.slice/crio-9688aa614828f39713fce947ff6fcab8e091ce6e0a330a12b13ade1829e390dc WatchSource:0}: Error finding container 9688aa614828f39713fce947ff6fcab8e091ce6e0a330a12b13ade1829e390dc: Status 404 returned error can't find the container with id 9688aa614828f39713fce947ff6fcab8e091ce6e0a330a12b13ade1829e390dc Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.031741 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m9jch"] Dec 03 18:56:21 crc kubenswrapper[4731]: E1203 18:56:21.031983 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294f0919-18b7-42f8-8528-c3ada8d12d53" containerName="collect-profiles" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.031995 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="294f0919-18b7-42f8-8528-c3ada8d12d53" containerName="collect-profiles" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.032124 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="294f0919-18b7-42f8-8528-c3ada8d12d53" 
containerName="collect-profiles" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.033033 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.038313 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.042972 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9jch"] Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.063993 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz2w6\" (UniqueName: \"kubernetes.io/projected/294f0919-18b7-42f8-8528-c3ada8d12d53-kube-api-access-bz2w6\") on node \"crc\" DevicePath \"\"" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.064026 4731 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/294f0919-18b7-42f8-8528-c3ada8d12d53-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.064035 4731 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/294f0919-18b7-42f8-8528-c3ada8d12d53-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.165038 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-utilities\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.165422 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvrw\" 
(UniqueName: \"kubernetes.io/projected/7311d0b6-0888-4788-974d-6f1e971123eb-kube-api-access-9fvrw\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.165471 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-catalog-content\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.266481 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvrw\" (UniqueName: \"kubernetes.io/projected/7311d0b6-0888-4788-974d-6f1e971123eb-kube-api-access-9fvrw\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.266562 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-catalog-content\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.266608 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-utilities\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.267524 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-utilities\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.268299 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-catalog-content\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.305732 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvrw\" (UniqueName: \"kubernetes.io/projected/7311d0b6-0888-4788-974d-6f1e971123eb-kube-api-access-9fvrw\") pod \"redhat-marketplace-m9jch\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.379393 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.434465 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvhq"] Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.435880 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.447286 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvhq"] Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.519997 4731 generic.go:334] "Generic (PLEG): container finished" podID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerID="247d101e342a5bdb4fd04153a7df7b19cf0d3bbd9a187042e9ccb4a10c9b4a60" exitCode=0 Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.520069 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7ct" event={"ID":"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3","Type":"ContainerDied","Data":"247d101e342a5bdb4fd04153a7df7b19cf0d3bbd9a187042e9ccb4a10c9b4a60"} Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.526494 4731 generic.go:334] "Generic (PLEG): container finished" podID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerID="f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09" exitCode=0 Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.526590 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jz9nw" event={"ID":"169fb2cd-829d-4f3e-8a08-33c431d6c3d1","Type":"ContainerDied","Data":"f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09"} Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.529626 4731 generic.go:334] "Generic (PLEG): container finished" podID="1cf59d42-b5d1-49f8-87a8-8230f17d36b8" containerID="8171a7eb72234817362664908a2851160acf85dd0bcf60aadeb0fa71e45d59f0" exitCode=0 Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.529679 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cf59d42-b5d1-49f8-87a8-8230f17d36b8","Type":"ContainerDied","Data":"8171a7eb72234817362664908a2851160acf85dd0bcf60aadeb0fa71e45d59f0"} Dec 03 
18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.533779 4731 generic.go:334] "Generic (PLEG): container finished" podID="7420acab-728e-4fa3-b4e0-49db517c4018" containerID="5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414" exitCode=0 Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.533840 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r88d7" event={"ID":"7420acab-728e-4fa3-b4e0-49db517c4018","Type":"ContainerDied","Data":"5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414"} Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.542030 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" event={"ID":"294f0919-18b7-42f8-8528-c3ada8d12d53","Type":"ContainerDied","Data":"91413d33324cc60e52537cc3deff8650079e575522a378deafc24fa50ff05476"} Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.542116 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91413d33324cc60e52537cc3deff8650079e575522a378deafc24fa50ff05476" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.542047 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.545332 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" event={"ID":"63540dce-b2ef-48ab-9aad-dc0afcbec369","Type":"ContainerStarted","Data":"08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb"} Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.545369 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.545381 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" event={"ID":"63540dce-b2ef-48ab-9aad-dc0afcbec369","Type":"ContainerStarted","Data":"9688aa614828f39713fce947ff6fcab8e091ce6e0a330a12b13ade1829e390dc"} Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.557662 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 18:56:21 crc kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:21 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:21 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.557731 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.570341 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-catalog-content\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.570389 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-utilities\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.570510 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98266\" (UniqueName: \"kubernetes.io/projected/35ae8418-c2af-4207-99a4-0fc2d9931ec4-kube-api-access-98266\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.648579 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" podStartSLOduration=77.648556047 podStartE2EDuration="1m17.648556047s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:21.64611736 +0000 UTC m=+102.244711844" watchObservedRunningTime="2025-12-03 18:56:21.648556047 +0000 UTC m=+102.247150501" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.678746 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-utilities\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " 
pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.678889 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98266\" (UniqueName: \"kubernetes.io/projected/35ae8418-c2af-4207-99a4-0fc2d9931ec4-kube-api-access-98266\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.679037 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-catalog-content\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.679267 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-utilities\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.680177 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-catalog-content\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.705684 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98266\" (UniqueName: \"kubernetes.io/projected/35ae8418-c2af-4207-99a4-0fc2d9931ec4-kube-api-access-98266\") pod \"redhat-marketplace-dpvhq\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " 
pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.712006 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.712346 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.718544 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.752448 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.761645 4731 patch_prober.go:28] interesting pod/downloads-7954f5f757-2mkdc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.761696 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2mkdc" podUID="ae92a1b7-7488-465a-bf52-0cdc4de799f3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.761942 4731 patch_prober.go:28] interesting pod/downloads-7954f5f757-2mkdc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.761962 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2mkdc" 
podUID="ae92a1b7-7488-465a-bf52-0cdc4de799f3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.773753 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.773805 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.774708 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.786200 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9jch"] Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.806720 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:21 crc kubenswrapper[4731]: I1203 18:56:21.867150 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.033662 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.034161 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.036906 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mkdzz"] Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.037894 4731 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.040481 4731 patch_prober.go:28] interesting pod/console-f9d7485db-65n9c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.040516 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-65n9c" podUID="3033d290-d147-4727-8d61-0dabed08e76d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.041675 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.055825 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mkdzz"] Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.088234 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxk7\" (UniqueName: \"kubernetes.io/projected/1fb9603e-2925-4558-ac8e-4877220963d5-kube-api-access-rzxk7\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.088314 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-catalog-content\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" 
Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.088464 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-utilities\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.190091 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-utilities\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.190223 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxk7\" (UniqueName: \"kubernetes.io/projected/1fb9603e-2925-4558-ac8e-4877220963d5-kube-api-access-rzxk7\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.190243 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-catalog-content\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.190726 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-catalog-content\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: 
I1203 18:56:22.190945 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-utilities\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.204420 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvhq"] Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.209320 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxk7\" (UniqueName: \"kubernetes.io/projected/1fb9603e-2925-4558-ac8e-4877220963d5-kube-api-access-rzxk7\") pod \"redhat-operators-mkdzz\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.359633 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.446153 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvcfv"] Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.447497 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.453405 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvcfv"] Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.493972 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-catalog-content\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.494052 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-utilities\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.494092 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdjx8\" (UniqueName: \"kubernetes.io/projected/d123dbe9-6202-40c8-99ca-556091b98f96-kube-api-access-qdjx8\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.557615 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.562035 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 18:56:22 crc 
kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:22 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:22 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.562071 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.563919 4731 generic.go:334] "Generic (PLEG): container finished" podID="7311d0b6-0888-4788-974d-6f1e971123eb" containerID="5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b" exitCode=0 Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.563958 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9jch" event={"ID":"7311d0b6-0888-4788-974d-6f1e971123eb","Type":"ContainerDied","Data":"5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b"} Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.563978 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9jch" event={"ID":"7311d0b6-0888-4788-974d-6f1e971123eb","Type":"ContainerStarted","Data":"ecb21fe6715682029dd1cc78ab8ce544204596d748021142144fd731a83a08eb"} Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.571812 4731 generic.go:334] "Generic (PLEG): container finished" podID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerID="25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e" exitCode=0 Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.572114 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvhq" event={"ID":"35ae8418-c2af-4207-99a4-0fc2d9931ec4","Type":"ContainerDied","Data":"25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e"} Dec 03 18:56:22 crc 
kubenswrapper[4731]: I1203 18:56:22.572189 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvhq" event={"ID":"35ae8418-c2af-4207-99a4-0fc2d9931ec4","Type":"ContainerStarted","Data":"421407adbd8c79de1bbfbd45a8c9f7fd901d08a723635e46e2d8dd363bd143ab"} Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.595865 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nrz57" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.597588 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xrbtb" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.598474 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-catalog-content\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.598580 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-utilities\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.598633 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdjx8\" (UniqueName: \"kubernetes.io/projected/d123dbe9-6202-40c8-99ca-556091b98f96-kube-api-access-qdjx8\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.602162 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-catalog-content\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.608034 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-utilities\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.646223 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdjx8\" (UniqueName: \"kubernetes.io/projected/d123dbe9-6202-40c8-99ca-556091b98f96-kube-api-access-qdjx8\") pod \"redhat-operators-wvcfv\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.798976 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.811093 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.822044 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83957f97-f30b-4ea7-8849-c7264d61fd52-metrics-certs\") pod \"network-metrics-daemon-p6zls\" (UID: \"83957f97-f30b-4ea7-8849-c7264d61fd52\") " pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.897868 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p6zls" Dec 03 18:56:22 crc kubenswrapper[4731]: I1203 18:56:22.993844 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mkdzz"] Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.343013 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.422644 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kube-api-access\") pod \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\" (UID: \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\") " Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.422707 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kubelet-dir\") pod \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\" (UID: \"1cf59d42-b5d1-49f8-87a8-8230f17d36b8\") " Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.423050 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1cf59d42-b5d1-49f8-87a8-8230f17d36b8" (UID: "1cf59d42-b5d1-49f8-87a8-8230f17d36b8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.444663 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1cf59d42-b5d1-49f8-87a8-8230f17d36b8" (UID: "1cf59d42-b5d1-49f8-87a8-8230f17d36b8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.490415 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvcfv"] Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.526042 4731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.526071 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf59d42-b5d1-49f8-87a8-8230f17d36b8-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.560016 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 18:56:23 crc kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:23 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:23 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.560386 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.588524 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p6zls"] Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.593888 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkdzz" 
event={"ID":"1fb9603e-2925-4558-ac8e-4877220963d5","Type":"ContainerStarted","Data":"9e5768c0ee4ae05ecbbdbb01ccf7eea4ca8511253adf2d78faf8415df5209096"} Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.597500 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cf59d42-b5d1-49f8-87a8-8230f17d36b8","Type":"ContainerDied","Data":"cd4ec1801810bc138156204bd2699b5b6f45ee1fa7cc948dae2864c94ee2fdf5"} Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.597533 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd4ec1801810bc138156204bd2699b5b6f45ee1fa7cc948dae2864c94ee2fdf5" Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.597597 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 18:56:23 crc kubenswrapper[4731]: I1203 18:56:23.608679 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcfv" event={"ID":"d123dbe9-6202-40c8-99ca-556091b98f96","Type":"ContainerStarted","Data":"e1ada5f93f9f588087e0bf42ddde4cd37b50e4437593e905b96faf46ce7d3c9e"} Dec 03 18:56:23 crc kubenswrapper[4731]: W1203 18:56:23.628813 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83957f97_f30b_4ea7_8849_c7264d61fd52.slice/crio-e65fdfdcbfcff65bf585f3ddf1f8b5181134827591f573400437c16273af9e1a WatchSource:0}: Error finding container e65fdfdcbfcff65bf585f3ddf1f8b5181134827591f573400437c16273af9e1a: Status 404 returned error can't find the container with id e65fdfdcbfcff65bf585f3ddf1f8b5181134827591f573400437c16273af9e1a Dec 03 18:56:24 crc kubenswrapper[4731]: I1203 18:56:24.555608 4731 patch_prober.go:28] interesting pod/router-default-5444994796-q4mkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 18:56:24 crc kubenswrapper[4731]: [-]has-synced failed: reason withheld Dec 03 18:56:24 crc kubenswrapper[4731]: [+]process-running ok Dec 03 18:56:24 crc kubenswrapper[4731]: healthz check failed Dec 03 18:56:24 crc kubenswrapper[4731]: I1203 18:56:24.555996 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q4mkw" podUID="acf996a8-0dfa-4d25-9112-5a3c4688cb77" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 18:56:24 crc kubenswrapper[4731]: I1203 18:56:24.649360 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p6zls" event={"ID":"83957f97-f30b-4ea7-8849-c7264d61fd52","Type":"ContainerStarted","Data":"f3aff8956f6faa1653e9ccdf8994b63cb4da2f010224a55872ed9db4d78a10c1"} Dec 03 18:56:24 crc kubenswrapper[4731]: I1203 18:56:24.649449 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p6zls" event={"ID":"83957f97-f30b-4ea7-8849-c7264d61fd52","Type":"ContainerStarted","Data":"e65fdfdcbfcff65bf585f3ddf1f8b5181134827591f573400437c16273af9e1a"} Dec 03 18:56:24 crc kubenswrapper[4731]: I1203 18:56:24.679807 4731 generic.go:334] "Generic (PLEG): container finished" podID="1fb9603e-2925-4558-ac8e-4877220963d5" containerID="d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935" exitCode=0 Dec 03 18:56:24 crc kubenswrapper[4731]: I1203 18:56:24.679885 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkdzz" event={"ID":"1fb9603e-2925-4558-ac8e-4877220963d5","Type":"ContainerDied","Data":"d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935"} Dec 03 18:56:24 crc kubenswrapper[4731]: I1203 18:56:24.683934 4731 generic.go:334] "Generic (PLEG): container finished" podID="d123dbe9-6202-40c8-99ca-556091b98f96" 
containerID="263944f0e021e025eb518e9e47e4c6b494000b9004a0fa4c265613038ca95b1c" exitCode=0 Dec 03 18:56:24 crc kubenswrapper[4731]: I1203 18:56:24.683974 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcfv" event={"ID":"d123dbe9-6202-40c8-99ca-556091b98f96","Type":"ContainerDied","Data":"263944f0e021e025eb518e9e47e4c6b494000b9004a0fa4c265613038ca95b1c"} Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.558374 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.567177 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-q4mkw" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.709209 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p6zls" event={"ID":"83957f97-f30b-4ea7-8849-c7264d61fd52","Type":"ContainerStarted","Data":"e20da6d97856b8e94402644e81c3c68268dc121c34247ac8a427694d1bcc7b54"} Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.757631 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-p6zls" podStartSLOduration=81.757610365 podStartE2EDuration="1m21.757610365s" podCreationTimestamp="2025-12-03 18:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:56:25.746576918 +0000 UTC m=+106.345171382" watchObservedRunningTime="2025-12-03 18:56:25.757610365 +0000 UTC m=+106.356204829" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.761055 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 18:56:25 crc kubenswrapper[4731]: E1203 18:56:25.761311 4731 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1cf59d42-b5d1-49f8-87a8-8230f17d36b8" containerName="pruner" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.761323 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf59d42-b5d1-49f8-87a8-8230f17d36b8" containerName="pruner" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.761430 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf59d42-b5d1-49f8-87a8-8230f17d36b8" containerName="pruner" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.761781 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.770502 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.770686 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.773811 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.885193 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc69a38c-e982-4116-b51e-873768fcbc4e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc69a38c-e982-4116-b51e-873768fcbc4e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.885304 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc69a38c-e982-4116-b51e-873768fcbc4e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc69a38c-e982-4116-b51e-873768fcbc4e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 
18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.986534 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc69a38c-e982-4116-b51e-873768fcbc4e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc69a38c-e982-4116-b51e-873768fcbc4e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.986692 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc69a38c-e982-4116-b51e-873768fcbc4e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc69a38c-e982-4116-b51e-873768fcbc4e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:25 crc kubenswrapper[4731]: I1203 18:56:25.987183 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc69a38c-e982-4116-b51e-873768fcbc4e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cc69a38c-e982-4116-b51e-873768fcbc4e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:26 crc kubenswrapper[4731]: I1203 18:56:26.008735 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc69a38c-e982-4116-b51e-873768fcbc4e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cc69a38c-e982-4116-b51e-873768fcbc4e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:26 crc kubenswrapper[4731]: I1203 18:56:26.099596 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:26 crc kubenswrapper[4731]: I1203 18:56:26.649702 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 18:56:26 crc kubenswrapper[4731]: W1203 18:56:26.699455 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcc69a38c_e982_4116_b51e_873768fcbc4e.slice/crio-eb4adf8ebb4958c27e3fda0a01bc750991644f7c0fe3432b5f1357e33efe926e WatchSource:0}: Error finding container eb4adf8ebb4958c27e3fda0a01bc750991644f7c0fe3432b5f1357e33efe926e: Status 404 returned error can't find the container with id eb4adf8ebb4958c27e3fda0a01bc750991644f7c0fe3432b5f1357e33efe926e Dec 03 18:56:26 crc kubenswrapper[4731]: I1203 18:56:26.731049 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc69a38c-e982-4116-b51e-873768fcbc4e","Type":"ContainerStarted","Data":"eb4adf8ebb4958c27e3fda0a01bc750991644f7c0fe3432b5f1357e33efe926e"} Dec 03 18:56:27 crc kubenswrapper[4731]: I1203 18:56:27.725670 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zf9fq" Dec 03 18:56:28 crc kubenswrapper[4731]: I1203 18:56:28.766618 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc69a38c-e982-4116-b51e-873768fcbc4e","Type":"ContainerStarted","Data":"8551b63e67d504ef71fb9ff092a5361ccb7e78166ccb71e2bfba8bb9a12f51cd"} Dec 03 18:56:28 crc kubenswrapper[4731]: I1203 18:56:28.786435 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.786410805 podStartE2EDuration="3.786410805s" podCreationTimestamp="2025-12-03 18:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 18:56:28.782917339 +0000 UTC m=+109.381511823" watchObservedRunningTime="2025-12-03 18:56:28.786410805 +0000 UTC m=+109.385005269" Dec 03 18:56:29 crc kubenswrapper[4731]: I1203 18:56:29.785565 4731 generic.go:334] "Generic (PLEG): container finished" podID="cc69a38c-e982-4116-b51e-873768fcbc4e" containerID="8551b63e67d504ef71fb9ff092a5361ccb7e78166ccb71e2bfba8bb9a12f51cd" exitCode=0 Dec 03 18:56:29 crc kubenswrapper[4731]: I1203 18:56:29.785614 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc69a38c-e982-4116-b51e-873768fcbc4e","Type":"ContainerDied","Data":"8551b63e67d504ef71fb9ff092a5361ccb7e78166ccb71e2bfba8bb9a12f51cd"} Dec 03 18:56:31 crc kubenswrapper[4731]: I1203 18:56:31.758902 4731 patch_prober.go:28] interesting pod/downloads-7954f5f757-2mkdc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 03 18:56:31 crc kubenswrapper[4731]: I1203 18:56:31.759206 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2mkdc" podUID="ae92a1b7-7488-465a-bf52-0cdc4de799f3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 03 18:56:31 crc kubenswrapper[4731]: I1203 18:56:31.759424 4731 patch_prober.go:28] interesting pod/downloads-7954f5f757-2mkdc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 03 18:56:31 crc kubenswrapper[4731]: I1203 18:56:31.759473 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2mkdc" podUID="ae92a1b7-7488-465a-bf52-0cdc4de799f3" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 03 18:56:32 crc kubenswrapper[4731]: I1203 18:56:32.294294 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:32 crc kubenswrapper[4731]: I1203 18:56:32.299693 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-65n9c" Dec 03 18:56:40 crc kubenswrapper[4731]: I1203 18:56:40.484444 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 18:56:41 crc kubenswrapper[4731]: I1203 18:56:41.764018 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2mkdc" Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.210443 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.270912 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc69a38c-e982-4116-b51e-873768fcbc4e-kube-api-access\") pod \"cc69a38c-e982-4116-b51e-873768fcbc4e\" (UID: \"cc69a38c-e982-4116-b51e-873768fcbc4e\") " Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.270975 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc69a38c-e982-4116-b51e-873768fcbc4e-kubelet-dir\") pod \"cc69a38c-e982-4116-b51e-873768fcbc4e\" (UID: \"cc69a38c-e982-4116-b51e-873768fcbc4e\") " Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.271082 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc69a38c-e982-4116-b51e-873768fcbc4e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cc69a38c-e982-4116-b51e-873768fcbc4e" (UID: "cc69a38c-e982-4116-b51e-873768fcbc4e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.271427 4731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc69a38c-e982-4116-b51e-873768fcbc4e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.277859 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc69a38c-e982-4116-b51e-873768fcbc4e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cc69a38c-e982-4116-b51e-873768fcbc4e" (UID: "cc69a38c-e982-4116-b51e-873768fcbc4e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.373068 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc69a38c-e982-4116-b51e-873768fcbc4e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.939776 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cc69a38c-e982-4116-b51e-873768fcbc4e","Type":"ContainerDied","Data":"eb4adf8ebb4958c27e3fda0a01bc750991644f7c0fe3432b5f1357e33efe926e"} Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.939832 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 18:56:49 crc kubenswrapper[4731]: I1203 18:56:49.939862 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb4adf8ebb4958c27e3fda0a01bc750991644f7c0fe3432b5f1357e33efe926e" Dec 03 18:56:51 crc kubenswrapper[4731]: I1203 18:56:51.985946 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6cmg" Dec 03 18:56:57 crc kubenswrapper[4731]: E1203 18:56:57.824135 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 18:56:57 crc kubenswrapper[4731]: E1203 18:56:57.825310 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l56l7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zkdtc_openshift-marketplace(7e4faf4b-b94a-4903-b53c-9b4fa33b8052): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 18:56:57 crc kubenswrapper[4731]: E1203 18:56:57.826597 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zkdtc" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" Dec 03 18:56:58 crc 
kubenswrapper[4731]: E1203 18:56:58.776141 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zkdtc" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" Dec 03 18:56:58 crc kubenswrapper[4731]: E1203 18:56:58.849807 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 18:56:58 crc kubenswrapper[4731]: E1203 18:56:58.850014 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98266,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dpvhq_openshift-marketplace(35ae8418-c2af-4207-99a4-0fc2d9931ec4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 18:56:58 crc kubenswrapper[4731]: E1203 18:56:58.852066 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dpvhq" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" Dec 03 18:56:59 crc 
kubenswrapper[4731]: I1203 18:56:59.762429 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 18:56:59 crc kubenswrapper[4731]: E1203 18:56:59.762679 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc69a38c-e982-4116-b51e-873768fcbc4e" containerName="pruner" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.762690 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc69a38c-e982-4116-b51e-873768fcbc4e" containerName="pruner" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.762785 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc69a38c-e982-4116-b51e-873768fcbc4e" containerName="pruner" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.763168 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.765813 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.765998 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.767193 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.843972 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.844121 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.945551 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.945649 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.946112 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:56:59 crc kubenswrapper[4731]: I1203 18:56:59.963940 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:57:00 crc kubenswrapper[4731]: I1203 18:57:00.093598 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.552283 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dpvhq" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.622984 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.623148 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9fvrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-m9jch_openshift-marketplace(7311d0b6-0888-4788-974d-6f1e971123eb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.624297 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-m9jch" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" Dec 03 18:57:02 crc 
kubenswrapper[4731]: E1203 18:57:02.654182 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.654385 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qdjx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-wvcfv_openshift-marketplace(d123dbe9-6202-40c8-99ca-556091b98f96): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.655898 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wvcfv" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.672355 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.672522 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzxk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mkdzz_openshift-marketplace(1fb9603e-2925-4558-ac8e-4877220963d5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.673758 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mkdzz" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" Dec 03 18:57:02 crc 
kubenswrapper[4731]: E1203 18:57:02.691113 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.691309 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jlzjk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-cl7ct_openshift-marketplace(468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 18:57:02 crc kubenswrapper[4731]: E1203 18:57:02.692552 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cl7ct" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.078190 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cl7ct" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.078375 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m9jch" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.078344 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wvcfv" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.153505 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.153697 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmq9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jz9nw_openshift-marketplace(169fb2cd-829d-4f3e-8a08-33c431d6c3d1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.155929 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jz9nw" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.186454 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.187149 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xh9zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-r88d7_openshift-marketplace(7420acab-728e-4fa3-b4e0-49db517c4018): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 18:57:04 crc kubenswrapper[4731]: E1203 18:57:04.190142 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-r88d7" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" Dec 03 18:57:04 crc 
kubenswrapper[4731]: I1203 18:57:04.273535 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 18:57:04 crc kubenswrapper[4731]: W1203 18:57:04.283227 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod22ddabcb_cfb5_40c0_887f_faf10cafbe22.slice/crio-5c6d4b2fa4a853160d4116652c12575b39c636bb651f30a401c4713d9164e3e0 WatchSource:0}: Error finding container 5c6d4b2fa4a853160d4116652c12575b39c636bb651f30a401c4713d9164e3e0: Status 404 returned error can't find the container with id 5c6d4b2fa4a853160d4116652c12575b39c636bb651f30a401c4713d9164e3e0 Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.055709 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22ddabcb-cfb5-40c0-887f-faf10cafbe22","Type":"ContainerStarted","Data":"a48f8445823d71ace96c3790b8b1c1f67797bb7274925c651f20931d18361750"} Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.056171 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22ddabcb-cfb5-40c0-887f-faf10cafbe22","Type":"ContainerStarted","Data":"5c6d4b2fa4a853160d4116652c12575b39c636bb651f30a401c4713d9164e3e0"} Dec 03 18:57:05 crc kubenswrapper[4731]: E1203 18:57:05.057151 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jz9nw" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" Dec 03 18:57:05 crc kubenswrapper[4731]: E1203 18:57:05.057815 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/community-operators-r88d7" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.078057 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=6.078037694 podStartE2EDuration="6.078037694s" podCreationTimestamp="2025-12-03 18:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:57:05.073088319 +0000 UTC m=+145.671682783" watchObservedRunningTime="2025-12-03 18:57:05.078037694 +0000 UTC m=+145.676632148" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.559156 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.563226 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.567603 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.724030 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kube-api-access\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.724176 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 
crc kubenswrapper[4731]: I1203 18:57:05.724200 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-var-lock\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.826106 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kube-api-access\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.826303 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.826358 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-var-lock\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.826459 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.826553 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-var-lock\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.847350 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kube-api-access\") pod \"installer-9-crc\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:05 crc kubenswrapper[4731]: I1203 18:57:05.887343 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.063532 4731 generic.go:334] "Generic (PLEG): container finished" podID="22ddabcb-cfb5-40c0-887f-faf10cafbe22" containerID="a48f8445823d71ace96c3790b8b1c1f67797bb7274925c651f20931d18361750" exitCode=0 Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.063642 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22ddabcb-cfb5-40c0-887f-faf10cafbe22","Type":"ContainerDied","Data":"a48f8445823d71ace96c3790b8b1c1f67797bb7274925c651f20931d18361750"} Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.100394 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.849837 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.850299 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.850385 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.850420 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.853026 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.854556 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.855104 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.861722 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.862211 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.869512 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.878792 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.882980 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.971990 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.980283 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:57:06 crc kubenswrapper[4731]: I1203 18:57:06.988843 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.084709 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ad209eab-4a27-4c4f-961b-1c6962bf56f0","Type":"ContainerStarted","Data":"d7ad5a480423da43539440422aacdcb38b0dbfbdffb942d12e0c05e17caeff36"} Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.085038 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ad209eab-4a27-4c4f-961b-1c6962bf56f0","Type":"ContainerStarted","Data":"f7bbc4f8abad6932496dd5027734f52820f06de7489dd8101552579612be6194"} Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.109009 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.108986215 podStartE2EDuration="2.108986215s" podCreationTimestamp="2025-12-03 18:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:57:07.103461441 +0000 UTC m=+147.702055895" watchObservedRunningTime="2025-12-03 18:57:07.108986215 +0000 UTC m=+147.707580689" Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.354767 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:57:07 crc kubenswrapper[4731]: W1203 18:57:07.419198 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-15868b9752a1baae37b6657773602853c484a6f5a73b6ca0cecff3e6660af56d WatchSource:0}: Error finding container 15868b9752a1baae37b6657773602853c484a6f5a73b6ca0cecff3e6660af56d: Status 404 returned error can't find the container with id 15868b9752a1baae37b6657773602853c484a6f5a73b6ca0cecff3e6660af56d Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.459429 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kubelet-dir\") pod \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\" (UID: \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\") " Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.459521 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kube-api-access\") pod \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\" (UID: \"22ddabcb-cfb5-40c0-887f-faf10cafbe22\") " Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.460180 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "22ddabcb-cfb5-40c0-887f-faf10cafbe22" (UID: "22ddabcb-cfb5-40c0-887f-faf10cafbe22"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.465064 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "22ddabcb-cfb5-40c0-887f-faf10cafbe22" (UID: "22ddabcb-cfb5-40c0-887f-faf10cafbe22"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:57:07 crc kubenswrapper[4731]: W1203 18:57:07.514134 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c286083dbb595da24cf2c28f1b7b182f7cb23fc32b1bdae5c6ea107e1b5972f2 WatchSource:0}: Error finding container c286083dbb595da24cf2c28f1b7b182f7cb23fc32b1bdae5c6ea107e1b5972f2: Status 404 returned error can't find the container with id c286083dbb595da24cf2c28f1b7b182f7cb23fc32b1bdae5c6ea107e1b5972f2 Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.561401 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:07 crc kubenswrapper[4731]: I1203 18:57:07.561430 4731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22ddabcb-cfb5-40c0-887f-faf10cafbe22-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.090740 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22ddabcb-cfb5-40c0-887f-faf10cafbe22","Type":"ContainerDied","Data":"5c6d4b2fa4a853160d4116652c12575b39c636bb651f30a401c4713d9164e3e0"} Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.090780 4731 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.090785 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c6d4b2fa4a853160d4116652c12575b39c636bb651f30a401c4713d9164e3e0" Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.092809 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b4f11fea0d0d4387151af864df09563edb41648343c6056f49bbb99de4a2d98b"} Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.092841 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"15868b9752a1baae37b6657773602853c484a6f5a73b6ca0cecff3e6660af56d"} Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.095127 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6fa8235e02c4452ac97c181e1cc7085b7351edf116bc4e13084fb8f59711d010"} Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.095164 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9b45c8056509ffbffd2a14791576d3c4574c01732a92e90b4e5f5031d80c3d93"} Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.095311 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.096828 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0ed851d0f227e314dd0777a54194c5b1cfe873ac340b3a0f356615571e21631b"} Dec 03 18:57:08 crc kubenswrapper[4731]: I1203 18:57:08.096860 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c286083dbb595da24cf2c28f1b7b182f7cb23fc32b1bdae5c6ea107e1b5972f2"} Dec 03 18:57:12 crc kubenswrapper[4731]: I1203 18:57:12.119448 4731 generic.go:334] "Generic (PLEG): container finished" podID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerID="75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d" exitCode=0 Dec 03 18:57:12 crc kubenswrapper[4731]: I1203 18:57:12.119550 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkdtc" event={"ID":"7e4faf4b-b94a-4903-b53c-9b4fa33b8052","Type":"ContainerDied","Data":"75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d"} Dec 03 18:57:13 crc kubenswrapper[4731]: I1203 18:57:13.126289 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkdtc" event={"ID":"7e4faf4b-b94a-4903-b53c-9b4fa33b8052","Type":"ContainerStarted","Data":"5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d"} Dec 03 18:57:13 crc kubenswrapper[4731]: I1203 18:57:13.145587 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zkdtc" podStartSLOduration=2.6546400390000002 podStartE2EDuration="55.145565823s" podCreationTimestamp="2025-12-03 18:56:18 +0000 UTC" firstStartedPulling="2025-12-03 18:56:20.498360586 +0000 UTC m=+101.096955050" lastFinishedPulling="2025-12-03 18:57:12.98928636 +0000 UTC m=+153.587880834" observedRunningTime="2025-12-03 18:57:13.143993241 
+0000 UTC m=+153.742587715" watchObservedRunningTime="2025-12-03 18:57:13.145565823 +0000 UTC m=+153.744160277" Dec 03 18:57:15 crc kubenswrapper[4731]: I1203 18:57:15.147674 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkdzz" event={"ID":"1fb9603e-2925-4558-ac8e-4877220963d5","Type":"ContainerStarted","Data":"46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51"} Dec 03 18:57:16 crc kubenswrapper[4731]: I1203 18:57:16.162419 4731 generic.go:334] "Generic (PLEG): container finished" podID="1fb9603e-2925-4558-ac8e-4877220963d5" containerID="46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51" exitCode=0 Dec 03 18:57:16 crc kubenswrapper[4731]: I1203 18:57:16.162526 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkdzz" event={"ID":"1fb9603e-2925-4558-ac8e-4877220963d5","Type":"ContainerDied","Data":"46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51"} Dec 03 18:57:17 crc kubenswrapper[4731]: I1203 18:57:17.174368 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkdzz" event={"ID":"1fb9603e-2925-4558-ac8e-4877220963d5","Type":"ContainerStarted","Data":"d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435"} Dec 03 18:57:17 crc kubenswrapper[4731]: I1203 18:57:17.177130 4731 generic.go:334] "Generic (PLEG): container finished" podID="7311d0b6-0888-4788-974d-6f1e971123eb" containerID="21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b" exitCode=0 Dec 03 18:57:17 crc kubenswrapper[4731]: I1203 18:57:17.177202 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9jch" event={"ID":"7311d0b6-0888-4788-974d-6f1e971123eb","Type":"ContainerDied","Data":"21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b"} Dec 03 18:57:17 crc kubenswrapper[4731]: I1203 18:57:17.211534 4731 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mkdzz" podStartSLOduration=4.207117068 podStartE2EDuration="55.211509842s" podCreationTimestamp="2025-12-03 18:56:22 +0000 UTC" firstStartedPulling="2025-12-03 18:56:25.714407221 +0000 UTC m=+106.313001685" lastFinishedPulling="2025-12-03 18:57:16.718799995 +0000 UTC m=+157.317394459" observedRunningTime="2025-12-03 18:57:17.194329528 +0000 UTC m=+157.792923992" watchObservedRunningTime="2025-12-03 18:57:17.211509842 +0000 UTC m=+157.810104306" Dec 03 18:57:19 crc kubenswrapper[4731]: I1203 18:57:19.292219 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:57:19 crc kubenswrapper[4731]: I1203 18:57:19.292284 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:57:19 crc kubenswrapper[4731]: I1203 18:57:19.495588 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:57:20 crc kubenswrapper[4731]: I1203 18:57:20.239474 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:57:22 crc kubenswrapper[4731]: I1203 18:57:22.359871 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:57:22 crc kubenswrapper[4731]: I1203 18:57:22.360495 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:57:22 crc kubenswrapper[4731]: I1203 18:57:22.416066 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:57:23 crc kubenswrapper[4731]: I1203 18:57:23.206210 4731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-m9jch" event={"ID":"7311d0b6-0888-4788-974d-6f1e971123eb","Type":"ContainerStarted","Data":"0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75"} Dec 03 18:57:23 crc kubenswrapper[4731]: I1203 18:57:23.226030 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m9jch" podStartSLOduration=2.957916643 podStartE2EDuration="1m2.226010602s" podCreationTimestamp="2025-12-03 18:56:21 +0000 UTC" firstStartedPulling="2025-12-03 18:56:22.573446528 +0000 UTC m=+103.172040992" lastFinishedPulling="2025-12-03 18:57:21.841540487 +0000 UTC m=+162.440134951" observedRunningTime="2025-12-03 18:57:23.223036514 +0000 UTC m=+163.821630978" watchObservedRunningTime="2025-12-03 18:57:23.226010602 +0000 UTC m=+163.824605066" Dec 03 18:57:23 crc kubenswrapper[4731]: I1203 18:57:23.248819 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:57:25 crc kubenswrapper[4731]: I1203 18:57:25.229772 4731 generic.go:334] "Generic (PLEG): container finished" podID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerID="858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058" exitCode=0 Dec 03 18:57:25 crc kubenswrapper[4731]: I1203 18:57:25.229878 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jz9nw" event={"ID":"169fb2cd-829d-4f3e-8a08-33c431d6c3d1","Type":"ContainerDied","Data":"858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058"} Dec 03 18:57:25 crc kubenswrapper[4731]: I1203 18:57:25.231680 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcfv" event={"ID":"d123dbe9-6202-40c8-99ca-556091b98f96","Type":"ContainerStarted","Data":"7401d3306646b37202030f75b341543fc2781547f6a4385803f815f99e76ae62"} Dec 03 18:57:25 crc kubenswrapper[4731]: I1203 18:57:25.234917 4731 
generic.go:334] "Generic (PLEG): container finished" podID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerID="5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45" exitCode=0 Dec 03 18:57:25 crc kubenswrapper[4731]: I1203 18:57:25.234979 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvhq" event={"ID":"35ae8418-c2af-4207-99a4-0fc2d9931ec4","Type":"ContainerDied","Data":"5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45"} Dec 03 18:57:25 crc kubenswrapper[4731]: I1203 18:57:25.236929 4731 generic.go:334] "Generic (PLEG): container finished" podID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerID="c7c3020233167f3c51a5bdf5245086cff5549b93142040af967aeb1ac6e37b6f" exitCode=0 Dec 03 18:57:25 crc kubenswrapper[4731]: I1203 18:57:25.236963 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7ct" event={"ID":"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3","Type":"ContainerDied","Data":"c7c3020233167f3c51a5bdf5245086cff5549b93142040af967aeb1ac6e37b6f"} Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.261733 4731 generic.go:334] "Generic (PLEG): container finished" podID="d123dbe9-6202-40c8-99ca-556091b98f96" containerID="7401d3306646b37202030f75b341543fc2781547f6a4385803f815f99e76ae62" exitCode=0 Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.261815 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcfv" event={"ID":"d123dbe9-6202-40c8-99ca-556091b98f96","Type":"ContainerDied","Data":"7401d3306646b37202030f75b341543fc2781547f6a4385803f815f99e76ae62"} Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.267765 4731 generic.go:334] "Generic (PLEG): container finished" podID="7420acab-728e-4fa3-b4e0-49db517c4018" containerID="b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3" exitCode=0 Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.267837 4731 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r88d7" event={"ID":"7420acab-728e-4fa3-b4e0-49db517c4018","Type":"ContainerDied","Data":"b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3"} Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.269986 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvhq" event={"ID":"35ae8418-c2af-4207-99a4-0fc2d9931ec4","Type":"ContainerStarted","Data":"06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e"} Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.276093 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7ct" event={"ID":"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3","Type":"ContainerStarted","Data":"bc4d35e7c5cbfa689e05ecf8a728f498da5ed7a3042270465bb0ad6774d98038"} Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.278316 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jz9nw" event={"ID":"169fb2cd-829d-4f3e-8a08-33c431d6c3d1","Type":"ContainerStarted","Data":"187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945"} Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.299479 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cl7ct" podStartSLOduration=3.133203891 podStartE2EDuration="1m7.299456821s" podCreationTimestamp="2025-12-03 18:56:19 +0000 UTC" firstStartedPulling="2025-12-03 18:56:21.523508031 +0000 UTC m=+102.122102505" lastFinishedPulling="2025-12-03 18:57:25.689760971 +0000 UTC m=+166.288355435" observedRunningTime="2025-12-03 18:57:26.296311386 +0000 UTC m=+166.894905890" watchObservedRunningTime="2025-12-03 18:57:26.299456821 +0000 UTC m=+166.898051305" Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.353815 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-dpvhq" podStartSLOduration=2.304119313 podStartE2EDuration="1m5.353794443s" podCreationTimestamp="2025-12-03 18:56:21 +0000 UTC" firstStartedPulling="2025-12-03 18:56:22.592860966 +0000 UTC m=+103.191455430" lastFinishedPulling="2025-12-03 18:57:25.642536096 +0000 UTC m=+166.241130560" observedRunningTime="2025-12-03 18:57:26.338067749 +0000 UTC m=+166.936662213" watchObservedRunningTime="2025-12-03 18:57:26.353794443 +0000 UTC m=+166.952388907" Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.355632 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jz9nw" podStartSLOduration=3.26528804 podStartE2EDuration="1m7.355624595s" podCreationTimestamp="2025-12-03 18:56:19 +0000 UTC" firstStartedPulling="2025-12-03 18:56:21.528579484 +0000 UTC m=+102.127173938" lastFinishedPulling="2025-12-03 18:57:25.618916029 +0000 UTC m=+166.217510493" observedRunningTime="2025-12-03 18:57:26.353132221 +0000 UTC m=+166.951726685" watchObservedRunningTime="2025-12-03 18:57:26.355624595 +0000 UTC m=+166.954219059" Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.468891 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:57:26 crc kubenswrapper[4731]: I1203 18:57:26.468958 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:57:29 crc kubenswrapper[4731]: I1203 18:57:29.616044 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:57:29 crc kubenswrapper[4731]: I1203 18:57:29.616415 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:57:29 crc kubenswrapper[4731]: I1203 18:57:29.661462 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:57:29 crc kubenswrapper[4731]: I1203 18:57:29.672180 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:57:29 crc kubenswrapper[4731]: I1203 18:57:29.672221 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:57:29 crc kubenswrapper[4731]: I1203 18:57:29.723829 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:57:30 crc kubenswrapper[4731]: I1203 18:57:30.307670 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcfv" event={"ID":"d123dbe9-6202-40c8-99ca-556091b98f96","Type":"ContainerStarted","Data":"01809bebd633016d51382b26639eaf334961e7b980fe368f4cbddb2b7173e999"} Dec 03 18:57:30 crc kubenswrapper[4731]: I1203 18:57:30.310020 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r88d7" event={"ID":"7420acab-728e-4fa3-b4e0-49db517c4018","Type":"ContainerStarted","Data":"cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8"} Dec 03 18:57:30 crc kubenswrapper[4731]: I1203 18:57:30.334577 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvcfv" podStartSLOduration=3.684900485 podStartE2EDuration="1m8.33455333s" podCreationTimestamp="2025-12-03 18:56:22 +0000 UTC" 
firstStartedPulling="2025-12-03 18:56:24.702690159 +0000 UTC m=+105.301284623" lastFinishedPulling="2025-12-03 18:57:29.352343004 +0000 UTC m=+169.950937468" observedRunningTime="2025-12-03 18:57:30.330797825 +0000 UTC m=+170.929392299" watchObservedRunningTime="2025-12-03 18:57:30.33455333 +0000 UTC m=+170.933147794" Dec 03 18:57:30 crc kubenswrapper[4731]: I1203 18:57:30.352599 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r88d7" podStartSLOduration=3.554452614 podStartE2EDuration="1m11.352565221s" podCreationTimestamp="2025-12-03 18:56:19 +0000 UTC" firstStartedPulling="2025-12-03 18:56:21.535302145 +0000 UTC m=+102.133896609" lastFinishedPulling="2025-12-03 18:57:29.333414752 +0000 UTC m=+169.932009216" observedRunningTime="2025-12-03 18:57:30.350065398 +0000 UTC m=+170.948659882" watchObservedRunningTime="2025-12-03 18:57:30.352565221 +0000 UTC m=+170.951159685" Dec 03 18:57:30 crc kubenswrapper[4731]: I1203 18:57:30.365862 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:57:30 crc kubenswrapper[4731]: I1203 18:57:30.368927 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:57:31 crc kubenswrapper[4731]: I1203 18:57:31.380619 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:57:31 crc kubenswrapper[4731]: I1203 18:57:31.380696 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:57:31 crc kubenswrapper[4731]: I1203 18:57:31.431203 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:57:31 crc kubenswrapper[4731]: I1203 18:57:31.775014 4731 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:57:31 crc kubenswrapper[4731]: I1203 18:57:31.775341 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:57:31 crc kubenswrapper[4731]: I1203 18:57:31.829114 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:57:32 crc kubenswrapper[4731]: I1203 18:57:32.372586 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:57:32 crc kubenswrapper[4731]: I1203 18:57:32.773470 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cl7ct"] Dec 03 18:57:32 crc kubenswrapper[4731]: I1203 18:57:32.773698 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cl7ct" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerName="registry-server" containerID="cri-o://bc4d35e7c5cbfa689e05ecf8a728f498da5ed7a3042270465bb0ad6774d98038" gracePeriod=2 Dec 03 18:57:32 crc kubenswrapper[4731]: I1203 18:57:32.799637 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:57:32 crc kubenswrapper[4731]: I1203 18:57:32.799713 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:57:32 crc kubenswrapper[4731]: I1203 18:57:32.827953 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:57:33 crc kubenswrapper[4731]: I1203 18:57:33.329643 4731 generic.go:334] "Generic (PLEG): container finished" podID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" 
containerID="bc4d35e7c5cbfa689e05ecf8a728f498da5ed7a3042270465bb0ad6774d98038" exitCode=0 Dec 03 18:57:33 crc kubenswrapper[4731]: I1203 18:57:33.329724 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7ct" event={"ID":"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3","Type":"ContainerDied","Data":"bc4d35e7c5cbfa689e05ecf8a728f498da5ed7a3042270465bb0ad6774d98038"} Dec 03 18:57:33 crc kubenswrapper[4731]: I1203 18:57:33.838577 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvcfv" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="registry-server" probeResult="failure" output=< Dec 03 18:57:33 crc kubenswrapper[4731]: timeout: failed to connect service ":50051" within 1s Dec 03 18:57:33 crc kubenswrapper[4731]: > Dec 03 18:57:33 crc kubenswrapper[4731]: I1203 18:57:33.949574 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.114386 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-utilities\") pod \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.114585 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-catalog-content\") pod \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.114683 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlzjk\" (UniqueName: \"kubernetes.io/projected/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-kube-api-access-jlzjk\") 
pod \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\" (UID: \"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3\") " Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.115312 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-utilities" (OuterVolumeSpecName: "utilities") pod "468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" (UID: "468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.116140 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.121433 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-kube-api-access-jlzjk" (OuterVolumeSpecName: "kube-api-access-jlzjk") pod "468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" (UID: "468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3"). InnerVolumeSpecName "kube-api-access-jlzjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.175663 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" (UID: "468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.216943 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.216982 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlzjk\" (UniqueName: \"kubernetes.io/projected/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3-kube-api-access-jlzjk\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.336589 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cl7ct" event={"ID":"468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3","Type":"ContainerDied","Data":"b030958786cc6820a51d47e972b51fd1d665ca07e1f7d2920cb669245f7b95c9"} Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.336677 4731 scope.go:117] "RemoveContainer" containerID="bc4d35e7c5cbfa689e05ecf8a728f498da5ed7a3042270465bb0ad6774d98038" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.336631 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cl7ct" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.365651 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cl7ct"] Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.368003 4731 scope.go:117] "RemoveContainer" containerID="c7c3020233167f3c51a5bdf5245086cff5549b93142040af967aeb1ac6e37b6f" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.372357 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cl7ct"] Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.383456 4731 scope.go:117] "RemoveContainer" containerID="247d101e342a5bdb4fd04153a7df7b19cf0d3bbd9a187042e9ccb4a10c9b4a60" Dec 03 18:57:34 crc kubenswrapper[4731]: I1203 18:57:34.886628 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlqm9"] Dec 03 18:57:35 crc kubenswrapper[4731]: I1203 18:57:35.863121 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" path="/var/lib/kubelet/pods/468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3/volumes" Dec 03 18:57:36 crc kubenswrapper[4731]: I1203 18:57:36.180435 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvhq"] Dec 03 18:57:36 crc kubenswrapper[4731]: I1203 18:57:36.181013 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dpvhq" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerName="registry-server" containerID="cri-o://06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e" gracePeriod=2 Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.131865 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.256844 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-utilities\") pod \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.256958 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-catalog-content\") pod \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.256995 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98266\" (UniqueName: \"kubernetes.io/projected/35ae8418-c2af-4207-99a4-0fc2d9931ec4-kube-api-access-98266\") pod \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\" (UID: \"35ae8418-c2af-4207-99a4-0fc2d9931ec4\") " Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.258108 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-utilities" (OuterVolumeSpecName: "utilities") pod "35ae8418-c2af-4207-99a4-0fc2d9931ec4" (UID: "35ae8418-c2af-4207-99a4-0fc2d9931ec4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.263535 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ae8418-c2af-4207-99a4-0fc2d9931ec4-kube-api-access-98266" (OuterVolumeSpecName: "kube-api-access-98266") pod "35ae8418-c2af-4207-99a4-0fc2d9931ec4" (UID: "35ae8418-c2af-4207-99a4-0fc2d9931ec4"). InnerVolumeSpecName "kube-api-access-98266". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.280503 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35ae8418-c2af-4207-99a4-0fc2d9931ec4" (UID: "35ae8418-c2af-4207-99a4-0fc2d9931ec4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.356951 4731 generic.go:334] "Generic (PLEG): container finished" podID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerID="06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e" exitCode=0 Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.357002 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvhq" event={"ID":"35ae8418-c2af-4207-99a4-0fc2d9931ec4","Type":"ContainerDied","Data":"06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e"} Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.357036 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvhq" event={"ID":"35ae8418-c2af-4207-99a4-0fc2d9931ec4","Type":"ContainerDied","Data":"421407adbd8c79de1bbfbd45a8c9f7fd901d08a723635e46e2d8dd363bd143ab"} Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.357058 4731 scope.go:117] "RemoveContainer" containerID="06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.357198 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpvhq" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.358664 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.358697 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ae8418-c2af-4207-99a4-0fc2d9931ec4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.358712 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98266\" (UniqueName: \"kubernetes.io/projected/35ae8418-c2af-4207-99a4-0fc2d9931ec4-kube-api-access-98266\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.373598 4731 scope.go:117] "RemoveContainer" containerID="5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.385208 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvhq"] Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.394041 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvhq"] Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.405111 4731 scope.go:117] "RemoveContainer" containerID="25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.425659 4731 scope.go:117] "RemoveContainer" containerID="06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e" Dec 03 18:57:37 crc kubenswrapper[4731]: E1203 18:57:37.429004 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e\": container with ID starting with 06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e not found: ID does not exist" containerID="06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.429097 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e"} err="failed to get container status \"06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e\": rpc error: code = NotFound desc = could not find container \"06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e\": container with ID starting with 06181461c5428af60102197fd538c0f1782c2a6cdf66014d8249755660422a0e not found: ID does not exist" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.429189 4731 scope.go:117] "RemoveContainer" containerID="5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45" Dec 03 18:57:37 crc kubenswrapper[4731]: E1203 18:57:37.429638 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45\": container with ID starting with 5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45 not found: ID does not exist" containerID="5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.429702 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45"} err="failed to get container status \"5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45\": rpc error: code = NotFound desc = could not find container \"5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45\": container with ID 
starting with 5822ce13a5508f04304c7a8270fc7f6e2eff0fc4741a93a150bc03338f553b45 not found: ID does not exist" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.429737 4731 scope.go:117] "RemoveContainer" containerID="25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e" Dec 03 18:57:37 crc kubenswrapper[4731]: E1203 18:57:37.429976 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e\": container with ID starting with 25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e not found: ID does not exist" containerID="25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.429999 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e"} err="failed to get container status \"25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e\": rpc error: code = NotFound desc = could not find container \"25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e\": container with ID starting with 25576b88625a635aacfc10e9a27f64e1884dd017fadbc0e7f36b8d7902a5c09e not found: ID does not exist" Dec 03 18:57:37 crc kubenswrapper[4731]: I1203 18:57:37.863070 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" path="/var/lib/kubelet/pods/35ae8418-c2af-4207-99a4-0fc2d9931ec4/volumes" Dec 03 18:57:39 crc kubenswrapper[4731]: I1203 18:57:39.809829 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:57:39 crc kubenswrapper[4731]: I1203 18:57:39.809877 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:57:39 crc 
kubenswrapper[4731]: I1203 18:57:39.854750 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:57:40 crc kubenswrapper[4731]: I1203 18:57:40.415312 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:57:41 crc kubenswrapper[4731]: I1203 18:57:41.774043 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r88d7"] Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.406118 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r88d7" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" containerName="registry-server" containerID="cri-o://cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8" gracePeriod=2 Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.771401 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.845827 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.905531 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.949566 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh9zs\" (UniqueName: \"kubernetes.io/projected/7420acab-728e-4fa3-b4e0-49db517c4018-kube-api-access-xh9zs\") pod \"7420acab-728e-4fa3-b4e0-49db517c4018\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.949683 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-utilities\") pod \"7420acab-728e-4fa3-b4e0-49db517c4018\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.949712 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-catalog-content\") pod \"7420acab-728e-4fa3-b4e0-49db517c4018\" (UID: \"7420acab-728e-4fa3-b4e0-49db517c4018\") " Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.951958 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-utilities" (OuterVolumeSpecName: "utilities") pod "7420acab-728e-4fa3-b4e0-49db517c4018" (UID: "7420acab-728e-4fa3-b4e0-49db517c4018"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.961486 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7420acab-728e-4fa3-b4e0-49db517c4018-kube-api-access-xh9zs" (OuterVolumeSpecName: "kube-api-access-xh9zs") pod "7420acab-728e-4fa3-b4e0-49db517c4018" (UID: "7420acab-728e-4fa3-b4e0-49db517c4018"). InnerVolumeSpecName "kube-api-access-xh9zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:57:42 crc kubenswrapper[4731]: I1203 18:57:42.998474 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7420acab-728e-4fa3-b4e0-49db517c4018" (UID: "7420acab-728e-4fa3-b4e0-49db517c4018"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.051962 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.052022 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7420acab-728e-4fa3-b4e0-49db517c4018-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.052048 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh9zs\" (UniqueName: \"kubernetes.io/projected/7420acab-728e-4fa3-b4e0-49db517c4018-kube-api-access-xh9zs\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.414942 4731 generic.go:334] "Generic (PLEG): container finished" podID="7420acab-728e-4fa3-b4e0-49db517c4018" 
containerID="cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8" exitCode=0 Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.415050 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r88d7" event={"ID":"7420acab-728e-4fa3-b4e0-49db517c4018","Type":"ContainerDied","Data":"cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8"} Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.415091 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r88d7" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.415121 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r88d7" event={"ID":"7420acab-728e-4fa3-b4e0-49db517c4018","Type":"ContainerDied","Data":"01967f255f0916532ee66abad481a685e00488ca94ac2086d83d2ef7a04e763b"} Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.415144 4731 scope.go:117] "RemoveContainer" containerID="cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.432140 4731 scope.go:117] "RemoveContainer" containerID="b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.452961 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r88d7"] Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.455771 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r88d7"] Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.468222 4731 scope.go:117] "RemoveContainer" containerID="5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.483110 4731 scope.go:117] "RemoveContainer" containerID="cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8" Dec 03 
18:57:43 crc kubenswrapper[4731]: E1203 18:57:43.483459 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8\": container with ID starting with cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8 not found: ID does not exist" containerID="cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.483496 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8"} err="failed to get container status \"cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8\": rpc error: code = NotFound desc = could not find container \"cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8\": container with ID starting with cfb7eb69daa6898eef3a8c6d89fb0c50be62bfc76d8d703feee01aa6344696a8 not found: ID does not exist" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.483520 4731 scope.go:117] "RemoveContainer" containerID="b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3" Dec 03 18:57:43 crc kubenswrapper[4731]: E1203 18:57:43.483770 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3\": container with ID starting with b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3 not found: ID does not exist" containerID="b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.483799 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3"} err="failed to get container status 
\"b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3\": rpc error: code = NotFound desc = could not find container \"b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3\": container with ID starting with b75175b1b55efe9e1a3fa1b95bff3bb10f39820237a0c9432c7278ef6a4c13f3 not found: ID does not exist" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.483813 4731 scope.go:117] "RemoveContainer" containerID="5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414" Dec 03 18:57:43 crc kubenswrapper[4731]: E1203 18:57:43.484016 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414\": container with ID starting with 5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414 not found: ID does not exist" containerID="5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.484040 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414"} err="failed to get container status \"5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414\": rpc error: code = NotFound desc = could not find container \"5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414\": container with ID starting with 5711a09baa2c34be06737b2dd25c3d35de8b35cb609b6ff24371fc4e5e55a414 not found: ID does not exist" Dec 03 18:57:43 crc kubenswrapper[4731]: I1203 18:57:43.865432 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" path="/var/lib/kubelet/pods/7420acab-728e-4fa3-b4e0-49db517c4018/volumes" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.011613 4731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 
03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012028 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerName="extract-content" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012058 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerName="extract-content" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012078 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerName="registry-server" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012090 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerName="registry-server" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012112 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerName="extract-utilities" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012126 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerName="extract-utilities" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012141 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" containerName="extract-utilities" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012155 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" containerName="extract-utilities" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012180 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerName="extract-utilities" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012192 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerName="extract-utilities" Dec 03 
18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012214 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerName="extract-content" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012226 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerName="extract-content" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012243 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" containerName="extract-content" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012287 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" containerName="extract-content" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012308 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" containerName="registry-server" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012323 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" containerName="registry-server" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012340 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ddabcb-cfb5-40c0-887f-faf10cafbe22" containerName="pruner" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012351 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ddabcb-cfb5-40c0-887f-faf10cafbe22" containerName="pruner" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.012366 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerName="registry-server" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012378 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerName="registry-server" Dec 03 18:57:44 crc 
kubenswrapper[4731]: I1203 18:57:44.012560 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7420acab-728e-4fa3-b4e0-49db517c4018" containerName="registry-server" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012585 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ae8418-c2af-4207-99a4-0fc2d9931ec4" containerName="registry-server" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012612 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ddabcb-cfb5-40c0-887f-faf10cafbe22" containerName="pruner" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.012626 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="468d2bd3-2bf4-499b-a6d0-5eaea5fd23d3" containerName="registry-server" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.013323 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.015543 4731 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.015926 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177" gracePeriod=15 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.015986 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e" gracePeriod=15 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.016064 4731 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734" gracePeriod=15 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.016114 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2" gracePeriod=15 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.016156 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425" gracePeriod=15 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.020227 4731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.020579 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.020646 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.020711 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.020852 4731 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.020915 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.020972 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.021039 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.021098 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.021153 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.021204 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.021331 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.021402 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.021470 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.021536 4731 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.021732 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.021814 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.022045 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.022143 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.023082 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.023198 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.059661 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.172336 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.172382 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.172409 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.172428 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.172528 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.172550 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.172628 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.172673 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274190 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274325 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274352 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274379 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274365 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274404 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274428 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274452 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274468 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274472 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274516 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274530 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274531 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 
18:57:44.274474 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274640 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.274638 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.345980 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:57:44 crc kubenswrapper[4731]: W1203 18:57:44.364195 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-891c948b18e6c6c0dda9ac554bc77738a7ffc924ee2be82b8d8cd6e309bca4f3 WatchSource:0}: Error finding container 891c948b18e6c6c0dda9ac554bc77738a7ffc924ee2be82b8d8cd6e309bca4f3: Status 404 returned error can't find the container with id 891c948b18e6c6c0dda9ac554bc77738a7ffc924ee2be82b8d8cd6e309bca4f3 Dec 03 18:57:44 crc kubenswrapper[4731]: E1203 18:57:44.367120 4731 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dc993814ff548 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 18:57:44.365978952 +0000 UTC m=+184.964573416,LastTimestamp:2025-12-03 18:57:44.365978952 +0000 UTC m=+184.964573416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.421632 4731 generic.go:334] "Generic (PLEG): container finished" 
podID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" containerID="d7ad5a480423da43539440422aacdcb38b0dbfbdffb942d12e0c05e17caeff36" exitCode=0 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.421712 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ad209eab-4a27-4c4f-961b-1c6962bf56f0","Type":"ContainerDied","Data":"d7ad5a480423da43539440422aacdcb38b0dbfbdffb942d12e0c05e17caeff36"} Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.422461 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.422715 4731 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.422956 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.425666 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.427182 4731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.427851 4731 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e" exitCode=0 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.427878 4731 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425" exitCode=0 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.427889 4731 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2" exitCode=0 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.427899 4731 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734" exitCode=2 Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.427956 4731 scope.go:117] "RemoveContainer" containerID="7cc667e82ddddee8c7accf2b80c4fff4bf91ed9eefd9696fae42ef8a8fc70c1a" Dec 03 18:57:44 crc kubenswrapper[4731]: I1203 18:57:44.434784 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"891c948b18e6c6c0dda9ac554bc77738a7ffc924ee2be82b8d8cd6e309bca4f3"} Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.441611 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e"} Dec 03 18:57:45 
crc kubenswrapper[4731]: I1203 18:57:45.443200 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.443461 4731 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.443667 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.446955 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.725214 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.726229 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.726901 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.727142 4731 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:45 crc kubenswrapper[4731]: E1203 18:57:45.851551 4731 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dc993814ff548 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 18:57:44.365978952 +0000 UTC m=+184.964573416,LastTimestamp:2025-12-03 18:57:44.365978952 +0000 UTC m=+184.964573416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.894099 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kube-api-access\") pod \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.895007 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kubelet-dir\") pod \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.895038 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-var-lock\") pod \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\" (UID: \"ad209eab-4a27-4c4f-961b-1c6962bf56f0\") " Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.895178 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ad209eab-4a27-4c4f-961b-1c6962bf56f0" (UID: "ad209eab-4a27-4c4f-961b-1c6962bf56f0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.895245 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-var-lock" (OuterVolumeSpecName: "var-lock") pod "ad209eab-4a27-4c4f-961b-1c6962bf56f0" (UID: "ad209eab-4a27-4c4f-961b-1c6962bf56f0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.895512 4731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.895538 4731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ad209eab-4a27-4c4f-961b-1c6962bf56f0-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.899183 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ad209eab-4a27-4c4f-961b-1c6962bf56f0" (UID: "ad209eab-4a27-4c4f-961b-1c6962bf56f0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:57:45 crc kubenswrapper[4731]: I1203 18:57:45.996200 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad209eab-4a27-4c4f-961b-1c6962bf56f0-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.383709 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.384713 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.385211 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.385509 4731 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.385943 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 
18:57:46.454956 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ad209eab-4a27-4c4f-961b-1c6962bf56f0","Type":"ContainerDied","Data":"f7bbc4f8abad6932496dd5027734f52820f06de7489dd8101552579612be6194"} Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.454998 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.455007 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7bbc4f8abad6932496dd5027734f52820f06de7489dd8101552579612be6194" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.457971 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.458800 4731 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177" exitCode=0 Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.458876 4731 scope.go:117] "RemoveContainer" containerID="1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.458889 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.467324 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.467868 4731 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.468123 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.474997 4731 scope.go:117] "RemoveContainer" containerID="0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.486858 4731 scope.go:117] "RemoveContainer" containerID="9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.499640 4731 scope.go:117] "RemoveContainer" containerID="360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505129 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505247 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505335 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505243 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505287 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505356 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505541 4731 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505560 4731 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.505569 4731 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.515818 4731 scope.go:117] "RemoveContainer" containerID="ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.529536 4731 scope.go:117] "RemoveContainer" containerID="07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.554521 4731 scope.go:117] "RemoveContainer" containerID="1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e" Dec 03 18:57:46 crc kubenswrapper[4731]: E1203 18:57:46.555015 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\": container with ID starting with 1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e not found: ID does not exist" containerID="1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.555060 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e"} err="failed to get container status \"1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\": rpc error: code = NotFound desc = could not find container \"1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e\": container with ID starting with 1cf0c838c3eedc25597e539081283ac1c9044080acf528757b39bde3e097927e not found: ID does not exist" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.555118 4731 scope.go:117] "RemoveContainer" containerID="0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425" Dec 03 18:57:46 crc kubenswrapper[4731]: E1203 18:57:46.555483 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\": container with ID starting with 0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425 not found: ID does not exist" containerID="0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.555508 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425"} err="failed to get container status \"0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\": rpc error: code = NotFound desc = could not find container \"0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425\": container with ID starting with 0ddf53651661438b9da4c6786067a9e427fadd8dff7f2b0443096151468cc425 not found: ID does not exist" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.555524 4731 scope.go:117] "RemoveContainer" containerID="9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2" Dec 03 18:57:46 crc kubenswrapper[4731]: E1203 18:57:46.555780 4731 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\": container with ID starting with 9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2 not found: ID does not exist" containerID="9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.555804 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2"} err="failed to get container status \"9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\": rpc error: code = NotFound desc = could not find container \"9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2\": container with ID starting with 9cbddc9dc215dae5fd6c78a1b4a49b3d4492d3205214c3c13f1a867b783b4fa2 not found: ID does not exist" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.555819 4731 scope.go:117] "RemoveContainer" containerID="360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734" Dec 03 18:57:46 crc kubenswrapper[4731]: E1203 18:57:46.556057 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\": container with ID starting with 360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734 not found: ID does not exist" containerID="360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.556078 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734"} err="failed to get container status \"360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\": rpc error: code = NotFound desc = could not find container 
\"360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734\": container with ID starting with 360d63c9b32e4fe178628c003cbf3cfd7a26effe36b1937e161782d6f7161734 not found: ID does not exist" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.556116 4731 scope.go:117] "RemoveContainer" containerID="ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177" Dec 03 18:57:46 crc kubenswrapper[4731]: E1203 18:57:46.556335 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\": container with ID starting with ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177 not found: ID does not exist" containerID="ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.556354 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177"} err="failed to get container status \"ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\": rpc error: code = NotFound desc = could not find container \"ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177\": container with ID starting with ae1e0a5f34778027db744bd02d2fca83ed46864f789b605131803f05efc8f177 not found: ID does not exist" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.556366 4731 scope.go:117] "RemoveContainer" containerID="07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786" Dec 03 18:57:46 crc kubenswrapper[4731]: E1203 18:57:46.556599 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\": container with ID starting with 07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786 not found: ID does not exist" 
containerID="07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.556628 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786"} err="failed to get container status \"07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\": rpc error: code = NotFound desc = could not find container \"07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786\": container with ID starting with 07b9bc8e1a50b90bd3269b233ba93f9987a5e3a1988eb9c56d20acede595b786 not found: ID does not exist" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.772653 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.772937 4731 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.773170 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.985163 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.985717 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.986001 4731 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.986324 4731 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:46 crc kubenswrapper[4731]: I1203 18:57:46.986663 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:47 crc kubenswrapper[4731]: I1203 18:57:47.868460 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 18:57:49 crc kubenswrapper[4731]: I1203 18:57:49.860791 4731 status_manager.go:851] 
"Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:49 crc kubenswrapper[4731]: I1203 18:57:49.861651 4731 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:49 crc kubenswrapper[4731]: I1203 18:57:49.862038 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:51 crc kubenswrapper[4731]: E1203 18:57:51.688675 4731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:51 crc kubenswrapper[4731]: E1203 18:57:51.690453 4731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:51 crc kubenswrapper[4731]: E1203 18:57:51.690904 4731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection 
refused" Dec 03 18:57:51 crc kubenswrapper[4731]: E1203 18:57:51.691193 4731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:51 crc kubenswrapper[4731]: E1203 18:57:51.691558 4731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:51 crc kubenswrapper[4731]: I1203 18:57:51.691604 4731 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 18:57:51 crc kubenswrapper[4731]: E1203 18:57:51.691962 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="200ms" Dec 03 18:57:51 crc kubenswrapper[4731]: E1203 18:57:51.893523 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="400ms" Dec 03 18:57:52 crc kubenswrapper[4731]: E1203 18:57:52.295170 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="800ms" Dec 03 18:57:53 crc kubenswrapper[4731]: E1203 18:57:53.096694 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="1.6s" Dec 03 18:57:54 crc kubenswrapper[4731]: E1203 18:57:54.697419 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="3.2s" Dec 03 18:57:55 crc kubenswrapper[4731]: E1203 18:57:55.852772 4731 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dc993814ff548 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 18:57:44.365978952 +0000 UTC m=+184.964573416,LastTimestamp:2025-12-03 18:57:44.365978952 +0000 UTC m=+184.964573416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 18:57:56 crc kubenswrapper[4731]: I1203 18:57:56.469168 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:57:56 crc kubenswrapper[4731]: I1203 18:57:56.469426 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:57:57 crc kubenswrapper[4731]: E1203 18:57:57.899390 4731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="6.4s" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.545756 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.546123 4731 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0" exitCode=1 Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.546173 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0"} Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.546884 4731 scope.go:117] "RemoveContainer" containerID="fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.547390 4731 status_manager.go:851] "Failed to get status for pod" 
podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.547987 4731 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.548519 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.549334 4731 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.855569 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.856755 4731 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.857233 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.857776 4731 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.858180 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.873779 4731 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.873808 4731 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:57:58 crc kubenswrapper[4731]: E1203 18:57:58.874194 4731 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:58 crc kubenswrapper[4731]: I1203 18:57:58.874718 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:58 crc kubenswrapper[4731]: W1203 18:57:58.902523 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-4f5d53b5471c999d68111871159aafc0ca1d59ad230bcfb9d7a56fac5040da5d WatchSource:0}: Error finding container 4f5d53b5471c999d68111871159aafc0ca1d59ad230bcfb9d7a56fac5040da5d: Status 404 returned error can't find the container with id 4f5d53b5471c999d68111871159aafc0ca1d59ad230bcfb9d7a56fac5040da5d Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.556513 4731 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4f09fdb9ee93b9a37cfed809922d24b72561a176f11d0361eafc279b0e6e5bc5" exitCode=0 Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.556648 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4f09fdb9ee93b9a37cfed809922d24b72561a176f11d0361eafc279b0e6e5bc5"} Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.557218 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f5d53b5471c999d68111871159aafc0ca1d59ad230bcfb9d7a56fac5040da5d"} Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.557836 4731 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.557884 4731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.558516 4731 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: E1203 18:57:59.558549 4731 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.559157 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.560448 4731 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.560761 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.564697 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.564790 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fabb2950423734c95ccf275cbcacfc74a1e441984079b9cf80a3236e277a727c"} Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.565645 4731 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.565994 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 
crc kubenswrapper[4731]: I1203 18:57:59.566483 4731 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.567083 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.862845 4731 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.863900 4731 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.864339 4731 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.864977 4731 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.865215 4731 status_manager.go:851] "Failed to get status for pod" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.17:6443: connect: connection refused" Dec 03 18:57:59 crc kubenswrapper[4731]: I1203 18:57:59.918899 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" podUID="3b79ffaf-63b0-4a26-bf1a-654f53537a2b" containerName="oauth-openshift" containerID="cri-o://05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb" gracePeriod=15 Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.324277 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.342995 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-service-ca\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343066 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-policies\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343100 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-idp-0-file-data\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343131 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-router-certs\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343168 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-session\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc 
kubenswrapper[4731]: I1203 18:58:00.343214 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-dir\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343300 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-login\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343343 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-cliconfig\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343407 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w6dj\" (UniqueName: \"kubernetes.io/projected/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-kube-api-access-6w6dj\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343431 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-trusted-ca-bundle\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343469 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-provider-selection\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343497 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-ocp-branding-template\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343541 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-serving-cert\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343564 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-error\") pod \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\" (UID: \"3b79ffaf-63b0-4a26-bf1a-654f53537a2b\") " Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343889 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.343935 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.344174 4731 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.344218 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.344247 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.345964 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.347306 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.354709 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.354996 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.355221 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.355450 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.355468 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.355788 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.357137 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-kube-api-access-6w6dj" (OuterVolumeSpecName: "kube-api-access-6w6dj") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "kube-api-access-6w6dj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.358785 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.361018 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3b79ffaf-63b0-4a26-bf1a-654f53537a2b" (UID: "3b79ffaf-63b0-4a26-bf1a-654f53537a2b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445620 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445656 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445669 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445681 4731 
reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445695 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445711 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445722 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w6dj\" (UniqueName: \"kubernetes.io/projected/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-kube-api-access-6w6dj\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445732 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445746 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445756 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-ocp-branding-template\") on node 
\"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445765 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.445774 4731 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b79ffaf-63b0-4a26-bf1a-654f53537a2b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.585459 4731 generic.go:334] "Generic (PLEG): container finished" podID="3b79ffaf-63b0-4a26-bf1a-654f53537a2b" containerID="05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb" exitCode=0 Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.585548 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.585575 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" event={"ID":"3b79ffaf-63b0-4a26-bf1a-654f53537a2b","Type":"ContainerDied","Data":"05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb"} Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.585629 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wlqm9" event={"ID":"3b79ffaf-63b0-4a26-bf1a-654f53537a2b","Type":"ContainerDied","Data":"a4f6528dae6495c11bc0c1b9db2a22210b469b8b9a3c391fbe41a4f2e3e3756e"} Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.585646 4731 scope.go:117] "RemoveContainer" containerID="05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.590294 4731 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"98c094380498cdb7dae2cab5712b79bd59b6f466972980cff36e6ad06381fbf2"} Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.590330 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5da826582c24c1931e5d60328cbb25d9c30a16684d7483093912a18f2fd76df5"} Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.590341 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c20bcdf7872682a93a1b76322810d27a0ef6c042919e029dda85955a28ebbf9b"} Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.640151 4731 scope.go:117] "RemoveContainer" containerID="05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb" Dec 03 18:58:00 crc kubenswrapper[4731]: E1203 18:58:00.640752 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb\": container with ID starting with 05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb not found: ID does not exist" containerID="05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb" Dec 03 18:58:00 crc kubenswrapper[4731]: I1203 18:58:00.640786 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb"} err="failed to get container status \"05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb\": rpc error: code = NotFound desc = could not find container \"05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb\": container 
with ID starting with 05d401240c1c5790a881cc33fbc1566f4f15e513fb6a5200b16475c6c766b6eb not found: ID does not exist" Dec 03 18:58:01 crc kubenswrapper[4731]: I1203 18:58:01.601112 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ef882cdd0ba89bebca0894a1b087bda66c2c0f78fccebf04ed67c5f3ce888c8e"} Dec 03 18:58:01 crc kubenswrapper[4731]: I1203 18:58:01.601412 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:58:01 crc kubenswrapper[4731]: I1203 18:58:01.601435 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ed53d9cfecd356666ed3ae071496960dd9b5a37d7a7aaed068e3292c0cbdf487"} Dec 03 18:58:01 crc kubenswrapper[4731]: I1203 18:58:01.601460 4731 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:58:01 crc kubenswrapper[4731]: I1203 18:58:01.601488 4731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:58:03 crc kubenswrapper[4731]: I1203 18:58:03.875905 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:58:03 crc kubenswrapper[4731]: I1203 18:58:03.876844 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:58:03 crc kubenswrapper[4731]: I1203 18:58:03.884498 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:58:04 crc kubenswrapper[4731]: I1203 18:58:04.041808 4731 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:58:04 crc kubenswrapper[4731]: I1203 18:58:04.042465 4731 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 18:58:04 crc kubenswrapper[4731]: I1203 18:58:04.042570 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 18:58:04 crc kubenswrapper[4731]: I1203 18:58:04.766268 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:58:06 crc kubenswrapper[4731]: I1203 18:58:06.609824 4731 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:58:07 crc kubenswrapper[4731]: I1203 18:58:07.637225 4731 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:58:07 crc kubenswrapper[4731]: I1203 18:58:07.637272 4731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:58:07 crc kubenswrapper[4731]: I1203 18:58:07.642453 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:58:07 crc kubenswrapper[4731]: I1203 18:58:07.644970 4731 status_manager.go:861] "Pod was deleted and then recreated, 
skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="13b023f9-d738-4ec2-a023-93cbda10282b" Dec 03 18:58:08 crc kubenswrapper[4731]: I1203 18:58:08.642995 4731 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:58:08 crc kubenswrapper[4731]: I1203 18:58:08.643027 4731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:58:09 crc kubenswrapper[4731]: I1203 18:58:09.883361 4731 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="13b023f9-d738-4ec2-a023-93cbda10282b" Dec 03 18:58:14 crc kubenswrapper[4731]: I1203 18:58:14.041924 4731 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 18:58:14 crc kubenswrapper[4731]: I1203 18:58:14.042587 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 18:58:15 crc kubenswrapper[4731]: I1203 18:58:15.741445 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 18:58:16 crc kubenswrapper[4731]: I1203 18:58:16.144238 4731 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 18:58:16 crc kubenswrapper[4731]: I1203 18:58:16.324412 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 18:58:16 crc kubenswrapper[4731]: I1203 18:58:16.399547 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 18:58:16 crc kubenswrapper[4731]: I1203 18:58:16.447973 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 18:58:16 crc kubenswrapper[4731]: I1203 18:58:16.465303 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 18:58:16 crc kubenswrapper[4731]: I1203 18:58:16.539821 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 18:58:17 crc kubenswrapper[4731]: I1203 18:58:17.088123 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 18:58:17 crc kubenswrapper[4731]: I1203 18:58:17.468553 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 18:58:17 crc kubenswrapper[4731]: I1203 18:58:17.699038 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 18:58:18 crc kubenswrapper[4731]: I1203 18:58:18.439566 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 18:58:18 crc kubenswrapper[4731]: I1203 18:58:18.567203 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 18:58:18 crc kubenswrapper[4731]: I1203 18:58:18.670686 4731 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 18:58:18 crc kubenswrapper[4731]: I1203 18:58:18.693429 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 18:58:18 crc kubenswrapper[4731]: I1203 18:58:18.698495 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 18:58:18 crc kubenswrapper[4731]: I1203 18:58:18.803873 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 18:58:18 crc kubenswrapper[4731]: I1203 18:58:18.810400 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 18:58:18 crc kubenswrapper[4731]: I1203 18:58:18.849534 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 18:58:19 crc kubenswrapper[4731]: I1203 18:58:19.081923 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 18:58:19 crc kubenswrapper[4731]: I1203 18:58:19.083976 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 18:58:19 crc kubenswrapper[4731]: I1203 18:58:19.243573 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 18:58:19 crc kubenswrapper[4731]: I1203 18:58:19.458937 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 18:58:19 crc kubenswrapper[4731]: I1203 18:58:19.611687 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 18:58:19 crc 
kubenswrapper[4731]: I1203 18:58:19.668948 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 18:58:19 crc kubenswrapper[4731]: I1203 18:58:19.750292 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 18:58:19 crc kubenswrapper[4731]: I1203 18:58:19.918765 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 18:58:19 crc kubenswrapper[4731]: I1203 18:58:19.970998 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.065908 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.088537 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.222106 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.240989 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.257030 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.299928 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.323607 4731 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.513740 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.558721 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.569994 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.626903 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.660348 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.691986 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.939775 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 18:58:20 crc kubenswrapper[4731]: I1203 18:58:20.967024 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.048768 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.135826 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 
18:58:21.165617 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.402268 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.403680 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.443742 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.464398 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.473078 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.589845 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.593111 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.621408 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.636344 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.648854 4731 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.650111 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.658190 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.820443 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.837284 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.971192 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.975404 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 18:58:21 crc kubenswrapper[4731]: I1203 18:58:21.975972 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.030284 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.033962 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.091319 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.149194 4731 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.203657 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.234440 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.348614 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.498133 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.512387 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.568613 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.596898 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.615925 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.647976 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.662454 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" 
Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.745775 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.935546 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 18:58:22 crc kubenswrapper[4731]: I1203 18:58:22.984154 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.020896 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.043498 4731 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.046604 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.069042 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.118189 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.156309 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.186579 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.193060 4731 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.211278 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.381080 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.447362 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.473352 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.668789 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.673805 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.692455 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.698730 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.751015 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.849552 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 18:58:23 crc 
kubenswrapper[4731]: I1203 18:58:23.916050 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 18:58:23 crc kubenswrapper[4731]: I1203 18:58:23.937192 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.018588 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.018594 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.042356 4731 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.042422 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.042482 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.043046 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"fabb2950423734c95ccf275cbcacfc74a1e441984079b9cf80a3236e277a727c"} 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.043182 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://fabb2950423734c95ccf275cbcacfc74a1e441984079b9cf80a3236e277a727c" gracePeriod=30 Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.095599 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.200410 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.228478 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.235182 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.251480 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.259181 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.341056 4731 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.357951 4731 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.442189 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.458919 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.487978 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.571065 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.577879 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.589367 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 18:58:24 crc kubenswrapper[4731]: I1203 18:58:24.664249 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.036050 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.093539 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.181311 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 
18:58:25.193711 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.215581 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.229716 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.359702 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.392555 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.417216 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.495365 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.550094 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.572390 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.573533 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.597667 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 
18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.641158 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.644507 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.664914 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.755940 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.766452 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.872230 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.881776 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.902329 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.959677 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 18:58:25 crc kubenswrapper[4731]: I1203 18:58:25.961530 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.037480 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 
18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.084034 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.086288 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.104525 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.178551 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.244632 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.317010 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.319227 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.412155 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.468867 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.468956 4731 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.469027 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.469886 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c98a346677b22d07d6f17286a0d2db33d97d9ba9acb50c45c802074e052c21ad"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.469986 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://c98a346677b22d07d6f17286a0d2db33d97d9ba9acb50c45c802074e052c21ad" gracePeriod=600 Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.580500 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.609753 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.613609 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 
18:58:26.647559 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.736432 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.750788 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="c98a346677b22d07d6f17286a0d2db33d97d9ba9acb50c45c802074e052c21ad" exitCode=0 Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.750845 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"c98a346677b22d07d6f17286a0d2db33d97d9ba9acb50c45c802074e052c21ad"} Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.772864 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.859387 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 18:58:26 crc kubenswrapper[4731]: I1203 18:58:26.901518 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.057639 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.211870 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 18:58:27 crc kubenswrapper[4731]: 
I1203 18:58:27.224071 4731 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.224391 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=43.224372227 podStartE2EDuration="43.224372227s" podCreationTimestamp="2025-12-03 18:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:58:06.299773768 +0000 UTC m=+206.898368232" watchObservedRunningTime="2025-12-03 18:58:27.224372227 +0000 UTC m=+227.822966691" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.228298 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wlqm9","openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.228358 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-c98796499-glglt"] Dec 03 18:58:27 crc kubenswrapper[4731]: E1203 18:58:27.228547 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" containerName="installer" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.228566 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" containerName="installer" Dec 03 18:58:27 crc kubenswrapper[4731]: E1203 18:58:27.228580 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b79ffaf-63b0-4a26-bf1a-654f53537a2b" containerName="oauth-openshift" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.228588 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b79ffaf-63b0-4a26-bf1a-654f53537a2b" containerName="oauth-openshift" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 
18:58:27.228695 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad209eab-4a27-4c4f-961b-1c6962bf56f0" containerName="installer" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.228712 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b79ffaf-63b0-4a26-bf1a-654f53537a2b" containerName="oauth-openshift" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.228940 4731 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.228976 4731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b0056941-56ab-4b8a-a25b-5fb8a83c9fb1" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.229187 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.233043 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.234210 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.234860 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.235023 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.235095 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 18:58:27 crc kubenswrapper[4731]: 
I1203 18:58:27.235107 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.235345 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239166 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239210 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-router-certs\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239238 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-login\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239273 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239307 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-service-ca\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239358 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aedd0de0-bc61-416f-96ea-d5b173dfb403-audit-dir\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239377 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9x4\" (UniqueName: \"kubernetes.io/projected/aedd0de0-bc61-416f-96ea-d5b173dfb403-kube-api-access-pk9x4\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239429 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") 
" pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239452 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239468 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239518 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-audit-policies\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239535 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239596 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-session\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.239612 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-error\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.240245 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.240445 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.240503 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.240645 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.240729 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.243738 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.243771 
4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.251118 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.251572 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.254628 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.257837 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.258172 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.260086 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.260070022 podStartE2EDuration="21.260070022s" podCreationTimestamp="2025-12-03 18:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:58:27.257417673 +0000 UTC m=+227.856012137" watchObservedRunningTime="2025-12-03 18:58:27.260070022 +0000 UTC m=+227.858664486" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340195 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-router-certs\") pod \"oauth-openshift-c98796499-glglt\" (UID: 
\"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340267 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-login\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340293 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340316 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-service-ca\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340346 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aedd0de0-bc61-416f-96ea-d5b173dfb403-audit-dir\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340365 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pk9x4\" (UniqueName: \"kubernetes.io/projected/aedd0de0-bc61-416f-96ea-d5b173dfb403-kube-api-access-pk9x4\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340390 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340409 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340427 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340445 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-audit-policies\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " 
pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340460 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340476 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-error\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340493 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-session\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.340519 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.341767 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-audit-policies\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.341935 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aedd0de0-bc61-416f-96ea-d5b173dfb403-audit-dir\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.342043 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.342632 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-service-ca\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.342834 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.346611 
4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-session\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.347150 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-error\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.347509 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-login\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.348670 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.349069 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c98796499-glglt\" (UID: 
\"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.350459 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.351450 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.352620 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aedd0de0-bc61-416f-96ea-d5b173dfb403-v4-0-config-system-router-certs\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.356741 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9x4\" (UniqueName: \"kubernetes.io/projected/aedd0de0-bc61-416f-96ea-d5b173dfb403-kube-api-access-pk9x4\") pod \"oauth-openshift-c98796499-glglt\" (UID: \"aedd0de0-bc61-416f-96ea-d5b173dfb403\") " pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.416199 4731 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.450676 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.558655 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.572119 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.594155 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.602703 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.628923 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.672851 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.702507 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.737946 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.760812 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.763708 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"a2a4fcd1c819b27038a8a6c4b9535c35827a67a748cd677cace1813cd153b256"} Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.784866 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.799802 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c98796499-glglt"] Dec 03 18:58:27 crc kubenswrapper[4731]: W1203 18:58:27.806135 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaedd0de0_bc61_416f_96ea_d5b173dfb403.slice/crio-6fae03e74d85ac90db1da53995e8de0d45c86855d88b0602fac3a91c7a881c5e WatchSource:0}: Error finding container 6fae03e74d85ac90db1da53995e8de0d45c86855d88b0602fac3a91c7a881c5e: Status 404 returned error can't find the container with id 6fae03e74d85ac90db1da53995e8de0d45c86855d88b0602fac3a91c7a881c5e Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.825547 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.846403 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.873715 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b79ffaf-63b0-4a26-bf1a-654f53537a2b" path="/var/lib/kubelet/pods/3b79ffaf-63b0-4a26-bf1a-654f53537a2b/volumes" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.885032 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 18:58:27 crc kubenswrapper[4731]: I1203 18:58:27.897369 4731 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.007937 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.101437 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.157449 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.162597 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.294413 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.389117 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.440580 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.474654 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.592688 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.638290 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 18:58:28 crc 
kubenswrapper[4731]: I1203 18:58:28.696563 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.697558 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.697987 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.771747 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c98796499-glglt" event={"ID":"aedd0de0-bc61-416f-96ea-d5b173dfb403","Type":"ContainerStarted","Data":"bb1fa896d6999aee0ccd6dcb530e1205f4a0fcb49849a68b9814b709dac06375"} Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.771807 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c98796499-glglt" event={"ID":"aedd0de0-bc61-416f-96ea-d5b173dfb403","Type":"ContainerStarted","Data":"6fae03e74d85ac90db1da53995e8de0d45c86855d88b0602fac3a91c7a881c5e"} Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.826201 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.828664 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 18:58:28 crc kubenswrapper[4731]: I1203 18:58:28.872118 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.029376 4731 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 
18:58:29.030173 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e" gracePeriod=5 Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.081167 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.093764 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.194143 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.198846 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.331364 4731 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.356113 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.378710 4731 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.392202 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.430335 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 18:58:29 
crc kubenswrapper[4731]: I1203 18:58:29.513521 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.562106 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.774134 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.776382 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.782208 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-c98796499-glglt" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.808771 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-c98796499-glglt" podStartSLOduration=55.808707431 podStartE2EDuration="55.808707431s" podCreationTimestamp="2025-12-03 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:58:28.799781867 +0000 UTC m=+229.398376351" watchObservedRunningTime="2025-12-03 18:58:29.808707431 +0000 UTC m=+230.407301915" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.834376 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.835661 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.844415 4731 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.869180 4731 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.891814 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 18:58:29 crc kubenswrapper[4731]: I1203 18:58:29.900779 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 18:58:30 crc kubenswrapper[4731]: I1203 18:58:30.131168 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 18:58:30 crc kubenswrapper[4731]: I1203 18:58:30.171903 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 18:58:30 crc kubenswrapper[4731]: I1203 18:58:30.181608 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 18:58:30 crc kubenswrapper[4731]: I1203 18:58:30.349900 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 18:58:30 crc kubenswrapper[4731]: I1203 18:58:30.670275 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 18:58:30 crc kubenswrapper[4731]: I1203 18:58:30.806228 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 18:58:31 crc kubenswrapper[4731]: I1203 18:58:31.476093 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 18:58:31 crc kubenswrapper[4731]: I1203 18:58:31.590004 4731 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 18:58:31 crc kubenswrapper[4731]: I1203 18:58:31.590359 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 18:58:31 crc kubenswrapper[4731]: I1203 18:58:31.732818 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 18:58:31 crc kubenswrapper[4731]: I1203 18:58:31.776365 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 18:58:31 crc kubenswrapper[4731]: I1203 18:58:31.886693 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 18:58:31 crc kubenswrapper[4731]: I1203 18:58:31.963095 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 18:58:32 crc kubenswrapper[4731]: I1203 18:58:32.233094 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 18:58:32 crc kubenswrapper[4731]: I1203 18:58:32.626033 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 18:58:32 crc kubenswrapper[4731]: I1203 18:58:32.775296 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 18:58:33 crc kubenswrapper[4731]: I1203 18:58:33.271486 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.611443 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" 
Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.612030 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.738846 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.738940 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739082 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739128 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739157 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739070 4731 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739126 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739442 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739538 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739795 4731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739844 4731 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739871 4731 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.739897 4731 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.751602 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.805913 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.805967 4731 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e" exitCode=137 Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.806014 4731 scope.go:117] "RemoveContainer" containerID="ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.806617 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.826730 4731 scope.go:117] "RemoveContainer" containerID="ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e" Dec 03 18:58:34 crc kubenswrapper[4731]: E1203 18:58:34.827421 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e\": container with ID starting with ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e not found: ID does not exist" containerID="ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.827463 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e"} err="failed to get container status \"ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e\": rpc error: code = NotFound desc = could not find container 
\"ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e\": container with ID starting with ac1ffa22660669b520ca25c1ae4f15d8cc4f50038f1cfd374e5e33ac03f8c27e not found: ID does not exist" Dec 03 18:58:34 crc kubenswrapper[4731]: I1203 18:58:34.841913 4731 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:35 crc kubenswrapper[4731]: I1203 18:58:35.862567 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 18:58:35 crc kubenswrapper[4731]: I1203 18:58:35.863351 4731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 03 18:58:35 crc kubenswrapper[4731]: I1203 18:58:35.873428 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 18:58:35 crc kubenswrapper[4731]: I1203 18:58:35.873472 4731 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="585c4c88-3126-492f-afe5-d9c2f56e6f65" Dec 03 18:58:35 crc kubenswrapper[4731]: I1203 18:58:35.876624 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 18:58:35 crc kubenswrapper[4731]: I1203 18:58:35.876677 4731 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="585c4c88-3126-492f-afe5-d9c2f56e6f65" Dec 03 18:58:45 crc kubenswrapper[4731]: I1203 18:58:45.278795 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" 
Dec 03 18:58:48 crc kubenswrapper[4731]: I1203 18:58:48.336596 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 18:58:49 crc kubenswrapper[4731]: I1203 18:58:49.358457 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 18:58:49 crc kubenswrapper[4731]: I1203 18:58:49.887570 4731 generic.go:334] "Generic (PLEG): container finished" podID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerID="3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5" exitCode=0 Dec 03 18:58:49 crc kubenswrapper[4731]: I1203 18:58:49.887620 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" event={"ID":"8788b686-0b60-4ad3-9e34-16f6fb03c2d0","Type":"ContainerDied","Data":"3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5"} Dec 03 18:58:49 crc kubenswrapper[4731]: I1203 18:58:49.888145 4731 scope.go:117] "RemoveContainer" containerID="3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5" Dec 03 18:58:50 crc kubenswrapper[4731]: I1203 18:58:50.895926 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" event={"ID":"8788b686-0b60-4ad3-9e34-16f6fb03c2d0","Type":"ContainerStarted","Data":"472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322"} Dec 03 18:58:50 crc kubenswrapper[4731]: I1203 18:58:50.896581 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:58:50 crc kubenswrapper[4731]: I1203 18:58:50.897899 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:58:52 crc kubenswrapper[4731]: I1203 18:58:52.994763 4731 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 18:58:54 crc kubenswrapper[4731]: I1203 18:58:54.919100 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 03 18:58:54 crc kubenswrapper[4731]: I1203 18:58:54.921137 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 18:58:54 crc kubenswrapper[4731]: I1203 18:58:54.921190 4731 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fabb2950423734c95ccf275cbcacfc74a1e441984079b9cf80a3236e277a727c" exitCode=137 Dec 03 18:58:54 crc kubenswrapper[4731]: I1203 18:58:54.921220 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fabb2950423734c95ccf275cbcacfc74a1e441984079b9cf80a3236e277a727c"} Dec 03 18:58:54 crc kubenswrapper[4731]: I1203 18:58:54.921279 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"16d22d2333487554ad7962379f34d612a950af3cc081aadfa63c5ccebbb63075"} Dec 03 18:58:54 crc kubenswrapper[4731]: I1203 18:58:54.921315 4731 scope.go:117] "RemoveContainer" containerID="fdd780daee4188a58d9e8af7bcfa886124b7ee57b1f860fcd0cff900a1eae0a0" Dec 03 18:58:55 crc kubenswrapper[4731]: I1203 18:58:55.931277 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 03 18:58:57 crc kubenswrapper[4731]: I1203 18:58:57.534275 4731 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 18:58:58 crc kubenswrapper[4731]: I1203 18:58:58.649008 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvcfv"] Dec 03 18:58:58 crc kubenswrapper[4731]: I1203 18:58:58.649554 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvcfv" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="registry-server" containerID="cri-o://01809bebd633016d51382b26639eaf334961e7b980fe368f4cbddb2b7173e999" gracePeriod=2 Dec 03 18:58:58 crc kubenswrapper[4731]: I1203 18:58:58.959703 4731 generic.go:334] "Generic (PLEG): container finished" podID="d123dbe9-6202-40c8-99ca-556091b98f96" containerID="01809bebd633016d51382b26639eaf334961e7b980fe368f4cbddb2b7173e999" exitCode=0 Dec 03 18:58:58 crc kubenswrapper[4731]: I1203 18:58:58.959750 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcfv" event={"ID":"d123dbe9-6202-40c8-99ca-556091b98f96","Type":"ContainerDied","Data":"01809bebd633016d51382b26639eaf334961e7b980fe368f4cbddb2b7173e999"} Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.025115 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.154824 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdjx8\" (UniqueName: \"kubernetes.io/projected/d123dbe9-6202-40c8-99ca-556091b98f96-kube-api-access-qdjx8\") pod \"d123dbe9-6202-40c8-99ca-556091b98f96\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.154872 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-utilities\") pod \"d123dbe9-6202-40c8-99ca-556091b98f96\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.154940 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-catalog-content\") pod \"d123dbe9-6202-40c8-99ca-556091b98f96\" (UID: \"d123dbe9-6202-40c8-99ca-556091b98f96\") " Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.155989 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-utilities" (OuterVolumeSpecName: "utilities") pod "d123dbe9-6202-40c8-99ca-556091b98f96" (UID: "d123dbe9-6202-40c8-99ca-556091b98f96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.160608 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d123dbe9-6202-40c8-99ca-556091b98f96-kube-api-access-qdjx8" (OuterVolumeSpecName: "kube-api-access-qdjx8") pod "d123dbe9-6202-40c8-99ca-556091b98f96" (UID: "d123dbe9-6202-40c8-99ca-556091b98f96"). InnerVolumeSpecName "kube-api-access-qdjx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.256039 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdjx8\" (UniqueName: \"kubernetes.io/projected/d123dbe9-6202-40c8-99ca-556091b98f96-kube-api-access-qdjx8\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.256075 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.275515 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d123dbe9-6202-40c8-99ca-556091b98f96" (UID: "d123dbe9-6202-40c8-99ca-556091b98f96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.357898 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d123dbe9-6202-40c8-99ca-556091b98f96-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.464405 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.783025 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.967084 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcfv" 
event={"ID":"d123dbe9-6202-40c8-99ca-556091b98f96","Type":"ContainerDied","Data":"e1ada5f93f9f588087e0bf42ddde4cd37b50e4437593e905b96faf46ce7d3c9e"} Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.967149 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcfv" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.967151 4731 scope.go:117] "RemoveContainer" containerID="01809bebd633016d51382b26639eaf334961e7b980fe368f4cbddb2b7173e999" Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.985908 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvcfv"] Dec 03 18:58:59 crc kubenswrapper[4731]: I1203 18:58:59.988216 4731 scope.go:117] "RemoveContainer" containerID="7401d3306646b37202030f75b341543fc2781547f6a4385803f815f99e76ae62" Dec 03 18:59:00 crc kubenswrapper[4731]: I1203 18:59:00.001746 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvcfv"] Dec 03 18:59:00 crc kubenswrapper[4731]: I1203 18:59:00.010770 4731 scope.go:117] "RemoveContainer" containerID="263944f0e021e025eb518e9e47e4c6b494000b9004a0fa4c265613038ca95b1c" Dec 03 18:59:01 crc kubenswrapper[4731]: I1203 18:59:01.281710 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 18:59:01 crc kubenswrapper[4731]: I1203 18:59:01.863023 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" path="/var/lib/kubelet/pods/d123dbe9-6202-40c8-99ca-556091b98f96/volumes" Dec 03 18:59:04 crc kubenswrapper[4731]: I1203 18:59:04.042667 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:59:04 crc kubenswrapper[4731]: I1203 18:59:04.050106 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:59:04 crc kubenswrapper[4731]: I1203 18:59:04.767048 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:59:04 crc kubenswrapper[4731]: I1203 18:59:04.772735 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.252898 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q7prx"] Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.253600 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" podUID="2528f79c-1409-4a38-9fef-a5b56cec0d3c" containerName="controller-manager" containerID="cri-o://246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a" gracePeriod=30 Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.356901 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"] Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.357420 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" podUID="9ddaec36-815c-4929-92dc-85e40f218be1" containerName="route-controller-manager" containerID="cri-o://7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795" gracePeriod=30 Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.683408 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.737206 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.790721 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ddaec36-815c-4929-92dc-85e40f218be1-serving-cert\") pod \"9ddaec36-815c-4929-92dc-85e40f218be1\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.790804 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-client-ca\") pod \"9ddaec36-815c-4929-92dc-85e40f218be1\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.790850 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-proxy-ca-bundles\") pod \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.790900 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbrlc\" (UniqueName: \"kubernetes.io/projected/2528f79c-1409-4a38-9fef-a5b56cec0d3c-kube-api-access-lbrlc\") pod \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.790959 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-client-ca\") pod \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.790987 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-config\") pod \"9ddaec36-815c-4929-92dc-85e40f218be1\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.791005 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-config\") pod \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.791028 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2528f79c-1409-4a38-9fef-a5b56cec0d3c-serving-cert\") pod \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\" (UID: \"2528f79c-1409-4a38-9fef-a5b56cec0d3c\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.791057 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrcpd\" (UniqueName: \"kubernetes.io/projected/9ddaec36-815c-4929-92dc-85e40f218be1-kube-api-access-zrcpd\") pod \"9ddaec36-815c-4929-92dc-85e40f218be1\" (UID: \"9ddaec36-815c-4929-92dc-85e40f218be1\") " Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.792036 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2528f79c-1409-4a38-9fef-a5b56cec0d3c" (UID: "2528f79c-1409-4a38-9fef-a5b56cec0d3c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.792193 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-client-ca" (OuterVolumeSpecName: "client-ca") pod "2528f79c-1409-4a38-9fef-a5b56cec0d3c" (UID: "2528f79c-1409-4a38-9fef-a5b56cec0d3c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.792339 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-config" (OuterVolumeSpecName: "config") pod "2528f79c-1409-4a38-9fef-a5b56cec0d3c" (UID: "2528f79c-1409-4a38-9fef-a5b56cec0d3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.792617 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-client-ca" (OuterVolumeSpecName: "client-ca") pod "9ddaec36-815c-4929-92dc-85e40f218be1" (UID: "9ddaec36-815c-4929-92dc-85e40f218be1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.792755 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-config" (OuterVolumeSpecName: "config") pod "9ddaec36-815c-4929-92dc-85e40f218be1" (UID: "9ddaec36-815c-4929-92dc-85e40f218be1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.796738 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ddaec36-815c-4929-92dc-85e40f218be1-kube-api-access-zrcpd" (OuterVolumeSpecName: "kube-api-access-zrcpd") pod "9ddaec36-815c-4929-92dc-85e40f218be1" (UID: "9ddaec36-815c-4929-92dc-85e40f218be1"). InnerVolumeSpecName "kube-api-access-zrcpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.797099 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2528f79c-1409-4a38-9fef-a5b56cec0d3c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2528f79c-1409-4a38-9fef-a5b56cec0d3c" (UID: "2528f79c-1409-4a38-9fef-a5b56cec0d3c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.797160 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2528f79c-1409-4a38-9fef-a5b56cec0d3c-kube-api-access-lbrlc" (OuterVolumeSpecName: "kube-api-access-lbrlc") pod "2528f79c-1409-4a38-9fef-a5b56cec0d3c" (UID: "2528f79c-1409-4a38-9fef-a5b56cec0d3c"). InnerVolumeSpecName "kube-api-access-lbrlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.797685 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ddaec36-815c-4929-92dc-85e40f218be1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9ddaec36-815c-4929-92dc-85e40f218be1" (UID: "9ddaec36-815c-4929-92dc-85e40f218be1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892572 4731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892609 4731 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892620 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbrlc\" (UniqueName: \"kubernetes.io/projected/2528f79c-1409-4a38-9fef-a5b56cec0d3c-kube-api-access-lbrlc\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892629 4731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892638 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ddaec36-815c-4929-92dc-85e40f218be1-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892648 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2528f79c-1409-4a38-9fef-a5b56cec0d3c-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892656 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2528f79c-1409-4a38-9fef-a5b56cec0d3c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892665 4731 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-zrcpd\" (UniqueName: \"kubernetes.io/projected/9ddaec36-815c-4929-92dc-85e40f218be1-kube-api-access-zrcpd\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:19 crc kubenswrapper[4731]: I1203 18:59:19.892673 4731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ddaec36-815c-4929-92dc-85e40f218be1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.081704 4731 generic.go:334] "Generic (PLEG): container finished" podID="9ddaec36-815c-4929-92dc-85e40f218be1" containerID="7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795" exitCode=0 Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.081781 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" event={"ID":"9ddaec36-815c-4929-92dc-85e40f218be1","Type":"ContainerDied","Data":"7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795"} Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.081816 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" event={"ID":"9ddaec36-815c-4929-92dc-85e40f218be1","Type":"ContainerDied","Data":"6eaf2e87be2b84ba8f06139c0930b9edb39f89e1a3e801f1cd34eb563d5a6198"} Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.081839 4731 scope.go:117] "RemoveContainer" containerID="7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.081964 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.085556 4731 generic.go:334] "Generic (PLEG): container finished" podID="2528f79c-1409-4a38-9fef-a5b56cec0d3c" containerID="246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a" exitCode=0 Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.085592 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" event={"ID":"2528f79c-1409-4a38-9fef-a5b56cec0d3c","Type":"ContainerDied","Data":"246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a"} Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.085616 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" event={"ID":"2528f79c-1409-4a38-9fef-a5b56cec0d3c","Type":"ContainerDied","Data":"3859de746066143e9cb9614044ad167d156c0137771a2d874c174769c29c05ba"} Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.085636 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-q7prx" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.101833 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"] Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.101900 4731 scope.go:117] "RemoveContainer" containerID="7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795" Dec 03 18:59:20 crc kubenswrapper[4731]: E1203 18:59:20.102705 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795\": container with ID starting with 7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795 not found: ID does not exist" containerID="7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.102750 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795"} err="failed to get container status \"7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795\": rpc error: code = NotFound desc = could not find container \"7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795\": container with ID starting with 7dfbd948ff31cf5afc9a49cc82e7c35deb7183807a152b93faa49c3703a92795 not found: ID does not exist" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.102775 4731 scope.go:117] "RemoveContainer" containerID="246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.109808 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2z65s"] Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.119879 4731 scope.go:117] 
"RemoveContainer" containerID="246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a" Dec 03 18:59:20 crc kubenswrapper[4731]: E1203 18:59:20.121378 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a\": container with ID starting with 246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a not found: ID does not exist" containerID="246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.121434 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a"} err="failed to get container status \"246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a\": rpc error: code = NotFound desc = could not find container \"246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a\": container with ID starting with 246dffb79d397c1911be92a0474eb637ca968a52db44ec4e50d511b8a8f92d8a not found: ID does not exist" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.130571 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q7prx"] Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.141557 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q7prx"] Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370275 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56746659c9-rgwhq"] Dec 03 18:59:20 crc kubenswrapper[4731]: E1203 18:59:20.370475 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2528f79c-1409-4a38-9fef-a5b56cec0d3c" containerName="controller-manager" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370488 4731 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2528f79c-1409-4a38-9fef-a5b56cec0d3c" containerName="controller-manager" Dec 03 18:59:20 crc kubenswrapper[4731]: E1203 18:59:20.370500 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370506 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 18:59:20 crc kubenswrapper[4731]: E1203 18:59:20.370515 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ddaec36-815c-4929-92dc-85e40f218be1" containerName="route-controller-manager" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370521 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ddaec36-815c-4929-92dc-85e40f218be1" containerName="route-controller-manager" Dec 03 18:59:20 crc kubenswrapper[4731]: E1203 18:59:20.370530 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="registry-server" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370536 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="registry-server" Dec 03 18:59:20 crc kubenswrapper[4731]: E1203 18:59:20.370547 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="extract-content" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370552 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="extract-content" Dec 03 18:59:20 crc kubenswrapper[4731]: E1203 18:59:20.370563 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="extract-utilities" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370568 4731 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="extract-utilities" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370688 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d123dbe9-6202-40c8-99ca-556091b98f96" containerName="registry-server" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370702 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2528f79c-1409-4a38-9fef-a5b56cec0d3c" containerName="controller-manager" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370711 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.370717 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ddaec36-815c-4929-92dc-85e40f218be1" containerName="route-controller-manager" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.371128 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.374105 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.379459 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.379463 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.379951 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.385210 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.385371 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.392111 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.397591 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2pkb\" (UniqueName: \"kubernetes.io/projected/29593d55-0927-4ae1-84c5-cb601d8c9acb-kube-api-access-n2pkb\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.397683 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-config\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.397722 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29593d55-0927-4ae1-84c5-cb601d8c9acb-serving-cert\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.397841 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-proxy-ca-bundles\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.397944 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-client-ca\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.397961 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56746659c9-rgwhq"] Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.428404 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh"] Dec 03 18:59:20 crc 
kubenswrapper[4731]: I1203 18:59:20.429046 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.431054 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.431440 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.431610 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.431730 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.431871 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.432055 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.438499 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh"] Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.498941 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-client-ca\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: 
I1203 18:59:20.499003 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/936b97b7-40d3-43f8-ae93-4d7265679677-config\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.499031 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2pkb\" (UniqueName: \"kubernetes.io/projected/29593d55-0927-4ae1-84c5-cb601d8c9acb-kube-api-access-n2pkb\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.499051 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-config\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.499072 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29593d55-0927-4ae1-84c5-cb601d8c9acb-serving-cert\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.499139 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/936b97b7-40d3-43f8-ae93-4d7265679677-client-ca\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: 
\"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.499271 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-proxy-ca-bundles\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.499319 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp8vx\" (UniqueName: \"kubernetes.io/projected/936b97b7-40d3-43f8-ae93-4d7265679677-kube-api-access-sp8vx\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.499361 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936b97b7-40d3-43f8-ae93-4d7265679677-serving-cert\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.500065 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-client-ca\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.500441 4731 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-config\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.500751 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29593d55-0927-4ae1-84c5-cb601d8c9acb-proxy-ca-bundles\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.516433 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29593d55-0927-4ae1-84c5-cb601d8c9acb-serving-cert\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.517169 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2pkb\" (UniqueName: \"kubernetes.io/projected/29593d55-0927-4ae1-84c5-cb601d8c9acb-kube-api-access-n2pkb\") pod \"controller-manager-56746659c9-rgwhq\" (UID: \"29593d55-0927-4ae1-84c5-cb601d8c9acb\") " pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.600608 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp8vx\" (UniqueName: \"kubernetes.io/projected/936b97b7-40d3-43f8-ae93-4d7265679677-kube-api-access-sp8vx\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 
18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.600678 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936b97b7-40d3-43f8-ae93-4d7265679677-serving-cert\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.600755 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/936b97b7-40d3-43f8-ae93-4d7265679677-config\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.600801 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/936b97b7-40d3-43f8-ae93-4d7265679677-client-ca\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.602100 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/936b97b7-40d3-43f8-ae93-4d7265679677-client-ca\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.602268 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/936b97b7-40d3-43f8-ae93-4d7265679677-config\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: 
\"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.615942 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936b97b7-40d3-43f8-ae93-4d7265679677-serving-cert\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.624173 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp8vx\" (UniqueName: \"kubernetes.io/projected/936b97b7-40d3-43f8-ae93-4d7265679677-kube-api-access-sp8vx\") pod \"route-controller-manager-564b64f585-cxhhh\" (UID: \"936b97b7-40d3-43f8-ae93-4d7265679677\") " pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.709824 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.744490 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:20 crc kubenswrapper[4731]: I1203 18:59:20.984065 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh"] Dec 03 18:59:21 crc kubenswrapper[4731]: I1203 18:59:21.092441 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" event={"ID":"936b97b7-40d3-43f8-ae93-4d7265679677","Type":"ContainerStarted","Data":"554dd4d67577175458c38ec49d7ad511d475300c1ce277305a93adb3be2368a0"} Dec 03 18:59:21 crc kubenswrapper[4731]: I1203 18:59:21.140673 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56746659c9-rgwhq"] Dec 03 18:59:21 crc kubenswrapper[4731]: W1203 18:59:21.149012 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29593d55_0927_4ae1_84c5_cb601d8c9acb.slice/crio-b3638a9f0a02d6494bd0b44853dfc54b7948224023e20e26737a7e404ef8b254 WatchSource:0}: Error finding container b3638a9f0a02d6494bd0b44853dfc54b7948224023e20e26737a7e404ef8b254: Status 404 returned error can't find the container with id b3638a9f0a02d6494bd0b44853dfc54b7948224023e20e26737a7e404ef8b254 Dec 03 18:59:21 crc kubenswrapper[4731]: I1203 18:59:21.861617 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2528f79c-1409-4a38-9fef-a5b56cec0d3c" path="/var/lib/kubelet/pods/2528f79c-1409-4a38-9fef-a5b56cec0d3c/volumes" Dec 03 18:59:21 crc kubenswrapper[4731]: I1203 18:59:21.863942 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ddaec36-815c-4929-92dc-85e40f218be1" path="/var/lib/kubelet/pods/9ddaec36-815c-4929-92dc-85e40f218be1/volumes" Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.101461 4731 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" event={"ID":"936b97b7-40d3-43f8-ae93-4d7265679677","Type":"ContainerStarted","Data":"eb1ecd79bd449666caee521d8e1440532da6c4a8d432f403c8795d13233f62ca"} Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.102185 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.102892 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" event={"ID":"29593d55-0927-4ae1-84c5-cb601d8c9acb","Type":"ContainerStarted","Data":"c77464890129edd53457df5c59718d22580531e0e5ef6b6fe88d183251131b23"} Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.102917 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" event={"ID":"29593d55-0927-4ae1-84c5-cb601d8c9acb","Type":"ContainerStarted","Data":"b3638a9f0a02d6494bd0b44853dfc54b7948224023e20e26737a7e404ef8b254"} Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.104969 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.110035 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.113168 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.122789 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-564b64f585-cxhhh" 
podStartSLOduration=2.122764766 podStartE2EDuration="2.122764766s" podCreationTimestamp="2025-12-03 18:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:59:22.121275527 +0000 UTC m=+282.719869981" watchObservedRunningTime="2025-12-03 18:59:22.122764766 +0000 UTC m=+282.721359230" Dec 03 18:59:22 crc kubenswrapper[4731]: I1203 18:59:22.158835 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56746659c9-rgwhq" podStartSLOduration=3.1588078729999998 podStartE2EDuration="3.158807873s" podCreationTimestamp="2025-12-03 18:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:59:22.144408647 +0000 UTC m=+282.743003111" watchObservedRunningTime="2025-12-03 18:59:22.158807873 +0000 UTC m=+282.757402327" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.008494 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zkdtc"] Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.009204 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zkdtc" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerName="registry-server" containerID="cri-o://5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d" gracePeriod=30 Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.018957 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jz9nw"] Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.019290 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jz9nw" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerName="registry-server" 
containerID="cri-o://187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945" gracePeriod=30 Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.031675 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srnnr"] Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.031943 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" containerID="cri-o://472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322" gracePeriod=30 Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.039905 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9jch"] Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.040214 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m9jch" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" containerName="registry-server" containerID="cri-o://0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75" gracePeriod=30 Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.046957 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mkdzz"] Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.047213 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mkdzz" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" containerName="registry-server" containerID="cri-o://d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435" gracePeriod=30 Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.063836 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bzsv6"] Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.066667 
4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.069883 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bzsv6"] Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.149018 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58c5709d-2320-4da3-a897-bf4289ed68ee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.149385 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58c5709d-2320-4da3-a897-bf4289ed68ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.149565 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz5f2\" (UniqueName: \"kubernetes.io/projected/58c5709d-2320-4da3-a897-bf4289ed68ee-kube-api-access-tz5f2\") pod \"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.251407 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58c5709d-2320-4da3-a897-bf4289ed68ee-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.251490 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58c5709d-2320-4da3-a897-bf4289ed68ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.251563 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz5f2\" (UniqueName: \"kubernetes.io/projected/58c5709d-2320-4da3-a897-bf4289ed68ee-kube-api-access-tz5f2\") pod \"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.253008 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58c5709d-2320-4da3-a897-bf4289ed68ee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.259016 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58c5709d-2320-4da3-a897-bf4289ed68ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.269649 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tz5f2\" (UniqueName: \"kubernetes.io/projected/58c5709d-2320-4da3-a897-bf4289ed68ee-kube-api-access-tz5f2\") pod \"marketplace-operator-79b997595-bzsv6\" (UID: \"58c5709d-2320-4da3-a897-bf4289ed68ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.390995 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.479892 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.515951 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.530355 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.542752 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.560750 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-utilities\") pod \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.560813 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-catalog-content\") pod \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.560847 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-catalog-content\") pod \"7311d0b6-0888-4788-974d-6f1e971123eb\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.560919 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-utilities\") pod \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.560958 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fvrw\" (UniqueName: \"kubernetes.io/projected/7311d0b6-0888-4788-974d-6f1e971123eb-kube-api-access-9fvrw\") pod \"7311d0b6-0888-4788-974d-6f1e971123eb\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.560996 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-catalog-content\") pod \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.561027 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmq9j\" (UniqueName: \"kubernetes.io/projected/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-kube-api-access-dmq9j\") pod \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\" (UID: \"169fb2cd-829d-4f3e-8a08-33c431d6c3d1\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.561067 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-utilities\") pod \"7311d0b6-0888-4788-974d-6f1e971123eb\" (UID: \"7311d0b6-0888-4788-974d-6f1e971123eb\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.561101 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l56l7\" (UniqueName: \"kubernetes.io/projected/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-kube-api-access-l56l7\") pod \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\" (UID: \"7e4faf4b-b94a-4903-b53c-9b4fa33b8052\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.561870 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-utilities" (OuterVolumeSpecName: "utilities") pod "169fb2cd-829d-4f3e-8a08-33c431d6c3d1" (UID: "169fb2cd-829d-4f3e-8a08-33c431d6c3d1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.569590 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-kube-api-access-l56l7" (OuterVolumeSpecName: "kube-api-access-l56l7") pod "7e4faf4b-b94a-4903-b53c-9b4fa33b8052" (UID: "7e4faf4b-b94a-4903-b53c-9b4fa33b8052"). InnerVolumeSpecName "kube-api-access-l56l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.576979 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-utilities" (OuterVolumeSpecName: "utilities") pod "7e4faf4b-b94a-4903-b53c-9b4fa33b8052" (UID: "7e4faf4b-b94a-4903-b53c-9b4fa33b8052"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.579401 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-utilities" (OuterVolumeSpecName: "utilities") pod "7311d0b6-0888-4788-974d-6f1e971123eb" (UID: "7311d0b6-0888-4788-974d-6f1e971123eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.581679 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7311d0b6-0888-4788-974d-6f1e971123eb-kube-api-access-9fvrw" (OuterVolumeSpecName: "kube-api-access-9fvrw") pod "7311d0b6-0888-4788-974d-6f1e971123eb" (UID: "7311d0b6-0888-4788-974d-6f1e971123eb"). InnerVolumeSpecName "kube-api-access-9fvrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.592246 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-kube-api-access-dmq9j" (OuterVolumeSpecName: "kube-api-access-dmq9j") pod "169fb2cd-829d-4f3e-8a08-33c431d6c3d1" (UID: "169fb2cd-829d-4f3e-8a08-33c431d6c3d1"). InnerVolumeSpecName "kube-api-access-dmq9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.603977 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.614228 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7311d0b6-0888-4788-974d-6f1e971123eb" (UID: "7311d0b6-0888-4788-974d-6f1e971123eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.648806 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "169fb2cd-829d-4f3e-8a08-33c431d6c3d1" (UID: "169fb2cd-829d-4f3e-8a08-33c431d6c3d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.658293 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e4faf4b-b94a-4903-b53c-9b4fa33b8052" (UID: "7e4faf4b-b94a-4903-b53c-9b4fa33b8052"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662049 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-utilities\") pod \"1fb9603e-2925-4558-ac8e-4877220963d5\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662104 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-operator-metrics\") pod \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662182 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-catalog-content\") pod \"1fb9603e-2925-4558-ac8e-4877220963d5\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662205 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzxk7\" (UniqueName: \"kubernetes.io/projected/1fb9603e-2925-4558-ac8e-4877220963d5-kube-api-access-rzxk7\") pod \"1fb9603e-2925-4558-ac8e-4877220963d5\" (UID: \"1fb9603e-2925-4558-ac8e-4877220963d5\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662226 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scw6f\" (UniqueName: \"kubernetes.io/projected/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-kube-api-access-scw6f\") pod \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662277 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-trusted-ca\") pod \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\" (UID: \"8788b686-0b60-4ad3-9e34-16f6fb03c2d0\") " Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662520 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662545 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l56l7\" (UniqueName: \"kubernetes.io/projected/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-kube-api-access-l56l7\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662555 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662566 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662575 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7311d0b6-0888-4788-974d-6f1e971123eb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662583 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662594 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fvrw\" (UniqueName: 
\"kubernetes.io/projected/7311d0b6-0888-4788-974d-6f1e971123eb-kube-api-access-9fvrw\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662602 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e4faf4b-b94a-4903-b53c-9b4fa33b8052-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662612 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmq9j\" (UniqueName: \"kubernetes.io/projected/169fb2cd-829d-4f3e-8a08-33c431d6c3d1-kube-api-access-dmq9j\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.662805 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-utilities" (OuterVolumeSpecName: "utilities") pod "1fb9603e-2925-4558-ac8e-4877220963d5" (UID: "1fb9603e-2925-4558-ac8e-4877220963d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.664185 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8788b686-0b60-4ad3-9e34-16f6fb03c2d0" (UID: "8788b686-0b60-4ad3-9e34-16f6fb03c2d0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.666322 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-kube-api-access-scw6f" (OuterVolumeSpecName: "kube-api-access-scw6f") pod "8788b686-0b60-4ad3-9e34-16f6fb03c2d0" (UID: "8788b686-0b60-4ad3-9e34-16f6fb03c2d0"). InnerVolumeSpecName "kube-api-access-scw6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.669802 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8788b686-0b60-4ad3-9e34-16f6fb03c2d0" (UID: "8788b686-0b60-4ad3-9e34-16f6fb03c2d0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.669978 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb9603e-2925-4558-ac8e-4877220963d5-kube-api-access-rzxk7" (OuterVolumeSpecName: "kube-api-access-rzxk7") pod "1fb9603e-2925-4558-ac8e-4877220963d5" (UID: "1fb9603e-2925-4558-ac8e-4877220963d5"). InnerVolumeSpecName "kube-api-access-rzxk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.763523 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.763565 4731 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.763582 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzxk7\" (UniqueName: \"kubernetes.io/projected/1fb9603e-2925-4558-ac8e-4877220963d5-kube-api-access-rzxk7\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.763596 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scw6f\" (UniqueName: 
\"kubernetes.io/projected/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-kube-api-access-scw6f\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.763608 4731 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8788b686-0b60-4ad3-9e34-16f6fb03c2d0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.778548 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fb9603e-2925-4558-ac8e-4877220963d5" (UID: "1fb9603e-2925-4558-ac8e-4877220963d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.872378 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb9603e-2925-4558-ac8e-4877220963d5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:59:23 crc kubenswrapper[4731]: I1203 18:59:23.893472 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bzsv6"] Dec 03 18:59:23 crc kubenswrapper[4731]: W1203 18:59:23.899427 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c5709d_2320_4da3_a897_bf4289ed68ee.slice/crio-3ce1f326504f4caaf2e59164f819f5073ec7874db58c6651504a8df6458448ca WatchSource:0}: Error finding container 3ce1f326504f4caaf2e59164f819f5073ec7874db58c6651504a8df6458448ca: Status 404 returned error can't find the container with id 3ce1f326504f4caaf2e59164f819f5073ec7874db58c6651504a8df6458448ca Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.114395 4731 generic.go:334] "Generic (PLEG): container finished" 
podID="7311d0b6-0888-4788-974d-6f1e971123eb" containerID="0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75" exitCode=0 Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.114605 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9jch" event={"ID":"7311d0b6-0888-4788-974d-6f1e971123eb","Type":"ContainerDied","Data":"0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.114731 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9jch" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.114762 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9jch" event={"ID":"7311d0b6-0888-4788-974d-6f1e971123eb","Type":"ContainerDied","Data":"ecb21fe6715682029dd1cc78ab8ce544204596d748021142144fd731a83a08eb"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.114787 4731 scope.go:117] "RemoveContainer" containerID="0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.116878 4731 generic.go:334] "Generic (PLEG): container finished" podID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerID="5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d" exitCode=0 Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.116922 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkdtc" event={"ID":"7e4faf4b-b94a-4903-b53c-9b4fa33b8052","Type":"ContainerDied","Data":"5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.116938 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkdtc" 
event={"ID":"7e4faf4b-b94a-4903-b53c-9b4fa33b8052","Type":"ContainerDied","Data":"b498ecb9c3f87d2efc1ab1a9bb50528be384f4356bc8a8064287ae1155573fdf"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.117026 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkdtc" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.118946 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" event={"ID":"58c5709d-2320-4da3-a897-bf4289ed68ee","Type":"ContainerStarted","Data":"346477308d499a1834fb4b92e17decf030e9c054073fcbc71198699857733fa6"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.118971 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" event={"ID":"58c5709d-2320-4da3-a897-bf4289ed68ee","Type":"ContainerStarted","Data":"3ce1f326504f4caaf2e59164f819f5073ec7874db58c6651504a8df6458448ca"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.119610 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.120776 4731 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bzsv6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.60:8080/healthz\": dial tcp 10.217.0.60:8080: connect: connection refused" start-of-body= Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.120803 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" podUID="58c5709d-2320-4da3-a897-bf4289ed68ee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.60:8080/healthz\": dial tcp 10.217.0.60:8080: connect: connection refused" Dec 03 18:59:24 crc 
kubenswrapper[4731]: I1203 18:59:24.121537 4731 generic.go:334] "Generic (PLEG): container finished" podID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerID="187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945" exitCode=0 Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.121583 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jz9nw" event={"ID":"169fb2cd-829d-4f3e-8a08-33c431d6c3d1","Type":"ContainerDied","Data":"187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.121601 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jz9nw" event={"ID":"169fb2cd-829d-4f3e-8a08-33c431d6c3d1","Type":"ContainerDied","Data":"7c72073ca8b48ecf40d98be3dad1df3839a2a7d8a528b24c479bd8b1e2ff0d76"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.121668 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jz9nw" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.124316 4731 generic.go:334] "Generic (PLEG): container finished" podID="1fb9603e-2925-4558-ac8e-4877220963d5" containerID="d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435" exitCode=0 Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.124351 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkdzz" event={"ID":"1fb9603e-2925-4558-ac8e-4877220963d5","Type":"ContainerDied","Data":"d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.124366 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkdzz" event={"ID":"1fb9603e-2925-4558-ac8e-4877220963d5","Type":"ContainerDied","Data":"9e5768c0ee4ae05ecbbdbb01ccf7eea4ca8511253adf2d78faf8415df5209096"} Dec 03 18:59:24 crc 
kubenswrapper[4731]: I1203 18:59:24.124417 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkdzz" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.132833 4731 scope.go:117] "RemoveContainer" containerID="21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.137599 4731 generic.go:334] "Generic (PLEG): container finished" podID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerID="472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322" exitCode=0 Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.137744 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" event={"ID":"8788b686-0b60-4ad3-9e34-16f6fb03c2d0","Type":"ContainerDied","Data":"472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.137798 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" event={"ID":"8788b686-0b60-4ad3-9e34-16f6fb03c2d0","Type":"ContainerDied","Data":"c5e913793c2ff945cdbf0a3bbdd8119ae72e6d652dd847439f06c98de604653a"} Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.137878 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-srnnr" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.141844 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" podStartSLOduration=1.141826349 podStartE2EDuration="1.141826349s" podCreationTimestamp="2025-12-03 18:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:59:24.141667374 +0000 UTC m=+284.740261838" watchObservedRunningTime="2025-12-03 18:59:24.141826349 +0000 UTC m=+284.740420813" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.153633 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9jch"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.155051 4731 scope.go:117] "RemoveContainer" containerID="5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.172675 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9jch"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.185987 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jz9nw"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.193094 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jz9nw"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.202065 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zkdtc"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.206142 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zkdtc"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.208690 4731 scope.go:117] "RemoveContainer" 
containerID="0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.209066 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srnnr"] Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.209451 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75\": container with ID starting with 0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75 not found: ID does not exist" containerID="0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.209486 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75"} err="failed to get container status \"0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75\": rpc error: code = NotFound desc = could not find container \"0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75\": container with ID starting with 0c661056dc9094f1c2b842adca34623eae1fcf459dcc4e38dee37c57065b4f75 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.209513 4731 scope.go:117] "RemoveContainer" containerID="21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.209754 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b\": container with ID starting with 21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b not found: ID does not exist" containerID="21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b" Dec 03 18:59:24 crc 
kubenswrapper[4731]: I1203 18:59:24.209799 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b"} err="failed to get container status \"21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b\": rpc error: code = NotFound desc = could not find container \"21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b\": container with ID starting with 21284215c146cc46f289868c5d386041e2fd2d484460095dda178c9a6a2ed57b not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.209815 4731 scope.go:117] "RemoveContainer" containerID="5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.211074 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b\": container with ID starting with 5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b not found: ID does not exist" containerID="5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.211129 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b"} err="failed to get container status \"5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b\": rpc error: code = NotFound desc = could not find container \"5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b\": container with ID starting with 5afae1a47c6decc069260e4f628cd21379df34fdf2bc5cc94f15ab325dfe1e5b not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.211161 4731 scope.go:117] "RemoveContainer" containerID="5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d" Dec 03 
18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.214365 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-srnnr"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.221277 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mkdzz"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.224858 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mkdzz"] Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.225113 4731 scope.go:117] "RemoveContainer" containerID="75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.241375 4731 scope.go:117] "RemoveContainer" containerID="3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.256992 4731 scope.go:117] "RemoveContainer" containerID="5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.257365 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d\": container with ID starting with 5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d not found: ID does not exist" containerID="5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.257399 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d"} err="failed to get container status \"5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d\": rpc error: code = NotFound desc = could not find container \"5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d\": container with 
ID starting with 5fbedf71d91745e15e883007573e807cd40f8bd244f2e3f1c19a2429f35b1c9d not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.257424 4731 scope.go:117] "RemoveContainer" containerID="75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.257700 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d\": container with ID starting with 75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d not found: ID does not exist" containerID="75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.257721 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d"} err="failed to get container status \"75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d\": rpc error: code = NotFound desc = could not find container \"75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d\": container with ID starting with 75b2968ca5fc1c54b1aa7b77d7eb6e2b047c0f96fd595172c2e36061718e019d not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.257736 4731 scope.go:117] "RemoveContainer" containerID="3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.258300 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6\": container with ID starting with 3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6 not found: ID does not exist" containerID="3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6" Dec 03 
18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.258356 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6"} err="failed to get container status \"3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6\": rpc error: code = NotFound desc = could not find container \"3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6\": container with ID starting with 3bdd11dc9011671b53ed359841c239f8233fbbdbcaa0694557bc85b6a9319fd6 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.258387 4731 scope.go:117] "RemoveContainer" containerID="187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.280954 4731 scope.go:117] "RemoveContainer" containerID="858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.298438 4731 scope.go:117] "RemoveContainer" containerID="f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.321793 4731 scope.go:117] "RemoveContainer" containerID="187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.322398 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945\": container with ID starting with 187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945 not found: ID does not exist" containerID="187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.322451 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945"} 
err="failed to get container status \"187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945\": rpc error: code = NotFound desc = could not find container \"187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945\": container with ID starting with 187d88f7ed68e04b56e5e3c1e18602c7be6ce5e590bf5e2142b6f43c1e527945 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.322511 4731 scope.go:117] "RemoveContainer" containerID="858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.322886 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058\": container with ID starting with 858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058 not found: ID does not exist" containerID="858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.322954 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058"} err="failed to get container status \"858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058\": rpc error: code = NotFound desc = could not find container \"858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058\": container with ID starting with 858415caf1f62a9c263d6f082e5dae07b71576e35d51d5457c4cb362efcb7058 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.322989 4731 scope.go:117] "RemoveContainer" containerID="f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.323229 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09\": container with ID starting with f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09 not found: ID does not exist" containerID="f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.323340 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09"} err="failed to get container status \"f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09\": rpc error: code = NotFound desc = could not find container \"f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09\": container with ID starting with f552bfce974e2953a6f2a32a5ea9889e9a3141d128f81d547b4f6a5ed8890a09 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.323360 4731 scope.go:117] "RemoveContainer" containerID="d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.335758 4731 scope.go:117] "RemoveContainer" containerID="46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.352755 4731 scope.go:117] "RemoveContainer" containerID="d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.373194 4731 scope.go:117] "RemoveContainer" containerID="d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.373599 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435\": container with ID starting with d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435 not found: ID does not exist" 
containerID="d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.373640 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435"} err="failed to get container status \"d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435\": rpc error: code = NotFound desc = could not find container \"d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435\": container with ID starting with d5c48e68f63ba6759545cedd87087afd32664a40e08997b4cb922c2516cac435 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.373691 4731 scope.go:117] "RemoveContainer" containerID="46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.374069 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51\": container with ID starting with 46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51 not found: ID does not exist" containerID="46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.374121 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51"} err="failed to get container status \"46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51\": rpc error: code = NotFound desc = could not find container \"46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51\": container with ID starting with 46ca79180c837d5795738a6a94c17f54e8f9e9926eb4f532f636045e01813a51 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.374152 4731 scope.go:117] 
"RemoveContainer" containerID="d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.374506 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935\": container with ID starting with d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935 not found: ID does not exist" containerID="d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.374529 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935"} err="failed to get container status \"d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935\": rpc error: code = NotFound desc = could not find container \"d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935\": container with ID starting with d33e0151b88722555af0137c7898d1b6e79b1a4a313f7a719154e5d304fd2935 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.374547 4731 scope.go:117] "RemoveContainer" containerID="472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.388209 4731 scope.go:117] "RemoveContainer" containerID="3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.400556 4731 scope.go:117] "RemoveContainer" containerID="472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.400914 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322\": container with ID starting with 
472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322 not found: ID does not exist" containerID="472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.400958 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322"} err="failed to get container status \"472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322\": rpc error: code = NotFound desc = could not find container \"472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322\": container with ID starting with 472728eec9b278edae45cfcd71e0a93daba406885359dbedb7e27dbec3f4c322 not found: ID does not exist" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.400989 4731 scope.go:117] "RemoveContainer" containerID="3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5" Dec 03 18:59:24 crc kubenswrapper[4731]: E1203 18:59:24.401263 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5\": container with ID starting with 3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5 not found: ID does not exist" containerID="3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5" Dec 03 18:59:24 crc kubenswrapper[4731]: I1203 18:59:24.401301 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5"} err="failed to get container status \"3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5\": rpc error: code = NotFound desc = could not find container \"3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5\": container with ID starting with 3d3db5cbf573f7e3d180521b77c70766e79be9f2873c65303b0c5384d71478f5 not found: ID does not 
exist" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.152156 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bzsv6" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.264865 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sf62s"] Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265100 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265117 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265134 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerName="extract-utilities" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265147 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerName="extract-utilities" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265163 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerName="extract-content" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265174 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerName="extract-content" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265188 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265198 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerName="registry-server" Dec 03 18:59:25 
crc kubenswrapper[4731]: E1203 18:59:25.265210 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" containerName="extract-content" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265220 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" containerName="extract-content" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265236 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265249 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265287 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerName="extract-utilities" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265297 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerName="extract-utilities" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265307 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerName="extract-content" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265316 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerName="extract-content" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265329 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265337 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerName="registry-server" Dec 03 18:59:25 crc 
kubenswrapper[4731]: E1203 18:59:25.265351 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" containerName="extract-content" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265359 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" containerName="extract-content" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265370 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" containerName="extract-utilities" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265379 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" containerName="extract-utilities" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265393 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265401 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265410 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" containerName="extract-utilities" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265418 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" containerName="extract-utilities" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265528 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265540 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" Dec 03 
18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265551 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265562 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265575 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265589 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" containerName="registry-server" Dec 03 18:59:25 crc kubenswrapper[4731]: E1203 18:59:25.265713 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.265724 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" containerName="marketplace-operator" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.266635 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.270023 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.283184 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sf62s"] Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.391499 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94f173d-5304-4cd9-bdfc-2dfb032b154c-utilities\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.391585 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvjv\" (UniqueName: \"kubernetes.io/projected/a94f173d-5304-4cd9-bdfc-2dfb032b154c-kube-api-access-hqvjv\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.391739 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94f173d-5304-4cd9-bdfc-2dfb032b154c-catalog-content\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.463387 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ljts9"] Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.464832 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.467701 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.475779 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljts9"] Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.493080 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvjv\" (UniqueName: \"kubernetes.io/projected/a94f173d-5304-4cd9-bdfc-2dfb032b154c-kube-api-access-hqvjv\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.493201 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94f173d-5304-4cd9-bdfc-2dfb032b154c-catalog-content\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.493279 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94f173d-5304-4cd9-bdfc-2dfb032b154c-utilities\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.494058 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94f173d-5304-4cd9-bdfc-2dfb032b154c-utilities\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " 
pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.494865 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94f173d-5304-4cd9-bdfc-2dfb032b154c-catalog-content\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.551358 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvjv\" (UniqueName: \"kubernetes.io/projected/a94f173d-5304-4cd9-bdfc-2dfb032b154c-kube-api-access-hqvjv\") pod \"community-operators-sf62s\" (UID: \"a94f173d-5304-4cd9-bdfc-2dfb032b154c\") " pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.582949 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.594991 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87bd198b-ff22-4e12-86ab-dfb52adbe31c-utilities\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.595066 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87bd198b-ff22-4e12-86ab-dfb52adbe31c-catalog-content\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.595112 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh8db\" (UniqueName: \"kubernetes.io/projected/87bd198b-ff22-4e12-86ab-dfb52adbe31c-kube-api-access-rh8db\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.696408 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87bd198b-ff22-4e12-86ab-dfb52adbe31c-utilities\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.696760 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87bd198b-ff22-4e12-86ab-dfb52adbe31c-catalog-content\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.696788 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh8db\" (UniqueName: \"kubernetes.io/projected/87bd198b-ff22-4e12-86ab-dfb52adbe31c-kube-api-access-rh8db\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.697438 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87bd198b-ff22-4e12-86ab-dfb52adbe31c-utilities\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.697688 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87bd198b-ff22-4e12-86ab-dfb52adbe31c-catalog-content\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.731156 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh8db\" (UniqueName: \"kubernetes.io/projected/87bd198b-ff22-4e12-86ab-dfb52adbe31c-kube-api-access-rh8db\") pod \"certified-operators-ljts9\" (UID: \"87bd198b-ff22-4e12-86ab-dfb52adbe31c\") " pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.790699 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.866673 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="169fb2cd-829d-4f3e-8a08-33c431d6c3d1" path="/var/lib/kubelet/pods/169fb2cd-829d-4f3e-8a08-33c431d6c3d1/volumes" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.867390 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb9603e-2925-4558-ac8e-4877220963d5" path="/var/lib/kubelet/pods/1fb9603e-2925-4558-ac8e-4877220963d5/volumes" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.868093 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7311d0b6-0888-4788-974d-6f1e971123eb" path="/var/lib/kubelet/pods/7311d0b6-0888-4788-974d-6f1e971123eb/volumes" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.869482 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e4faf4b-b94a-4903-b53c-9b4fa33b8052" path="/var/lib/kubelet/pods/7e4faf4b-b94a-4903-b53c-9b4fa33b8052/volumes" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.872477 4731 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="8788b686-0b60-4ad3-9e34-16f6fb03c2d0" path="/var/lib/kubelet/pods/8788b686-0b60-4ad3-9e34-16f6fb03c2d0/volumes" Dec 03 18:59:25 crc kubenswrapper[4731]: I1203 18:59:25.998700 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sf62s"] Dec 03 18:59:26 crc kubenswrapper[4731]: W1203 18:59:26.002076 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda94f173d_5304_4cd9_bdfc_2dfb032b154c.slice/crio-1f6d447a4e3a13e1b95446a0b42d9d5df19b35d222e3a473cfcd01bca74fe59f WatchSource:0}: Error finding container 1f6d447a4e3a13e1b95446a0b42d9d5df19b35d222e3a473cfcd01bca74fe59f: Status 404 returned error can't find the container with id 1f6d447a4e3a13e1b95446a0b42d9d5df19b35d222e3a473cfcd01bca74fe59f Dec 03 18:59:26 crc kubenswrapper[4731]: I1203 18:59:26.154199 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf62s" event={"ID":"a94f173d-5304-4cd9-bdfc-2dfb032b154c","Type":"ContainerStarted","Data":"1f6d447a4e3a13e1b95446a0b42d9d5df19b35d222e3a473cfcd01bca74fe59f"} Dec 03 18:59:26 crc kubenswrapper[4731]: I1203 18:59:26.191378 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljts9"] Dec 03 18:59:26 crc kubenswrapper[4731]: W1203 18:59:26.197538 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87bd198b_ff22_4e12_86ab_dfb52adbe31c.slice/crio-2aba7558c94bc1bd019aa22a0cc785a91a1a669d6cdfe261da4c5065113e430e WatchSource:0}: Error finding container 2aba7558c94bc1bd019aa22a0cc785a91a1a669d6cdfe261da4c5065113e430e: Status 404 returned error can't find the container with id 2aba7558c94bc1bd019aa22a0cc785a91a1a669d6cdfe261da4c5065113e430e Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.159013 4731 generic.go:334] "Generic (PLEG): container 
finished" podID="87bd198b-ff22-4e12-86ab-dfb52adbe31c" containerID="a36bd920464d8b3157ecd8833e0baa998effbfbc9ecf6e991bb440132405eaac" exitCode=0 Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.159050 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljts9" event={"ID":"87bd198b-ff22-4e12-86ab-dfb52adbe31c","Type":"ContainerDied","Data":"a36bd920464d8b3157ecd8833e0baa998effbfbc9ecf6e991bb440132405eaac"} Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.159090 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljts9" event={"ID":"87bd198b-ff22-4e12-86ab-dfb52adbe31c","Type":"ContainerStarted","Data":"2aba7558c94bc1bd019aa22a0cc785a91a1a669d6cdfe261da4c5065113e430e"} Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.161277 4731 generic.go:334] "Generic (PLEG): container finished" podID="a94f173d-5304-4cd9-bdfc-2dfb032b154c" containerID="4f09110bcd38be7efda20e5e85e4a18450ce296bbaee646fb1577b73dcae02cf" exitCode=0 Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.161464 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf62s" event={"ID":"a94f173d-5304-4cd9-bdfc-2dfb032b154c","Type":"ContainerDied","Data":"4f09110bcd38be7efda20e5e85e4a18450ce296bbaee646fb1577b73dcae02cf"} Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.656174 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bm84h"] Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.657198 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.661841 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.665619 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bm84h"] Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.722759 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk5tg\" (UniqueName: \"kubernetes.io/projected/5740f025-332a-4be3-8473-ec656326c634-kube-api-access-qk5tg\") pod \"redhat-operators-bm84h\" (UID: \"5740f025-332a-4be3-8473-ec656326c634\") " pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.722848 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5740f025-332a-4be3-8473-ec656326c634-catalog-content\") pod \"redhat-operators-bm84h\" (UID: \"5740f025-332a-4be3-8473-ec656326c634\") " pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.722885 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5740f025-332a-4be3-8473-ec656326c634-utilities\") pod \"redhat-operators-bm84h\" (UID: \"5740f025-332a-4be3-8473-ec656326c634\") " pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.824653 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk5tg\" (UniqueName: \"kubernetes.io/projected/5740f025-332a-4be3-8473-ec656326c634-kube-api-access-qk5tg\") pod \"redhat-operators-bm84h\" (UID: 
\"5740f025-332a-4be3-8473-ec656326c634\") " pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.824712 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5740f025-332a-4be3-8473-ec656326c634-catalog-content\") pod \"redhat-operators-bm84h\" (UID: \"5740f025-332a-4be3-8473-ec656326c634\") " pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.824740 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5740f025-332a-4be3-8473-ec656326c634-utilities\") pod \"redhat-operators-bm84h\" (UID: \"5740f025-332a-4be3-8473-ec656326c634\") " pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.825208 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5740f025-332a-4be3-8473-ec656326c634-utilities\") pod \"redhat-operators-bm84h\" (UID: \"5740f025-332a-4be3-8473-ec656326c634\") " pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.825312 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5740f025-332a-4be3-8473-ec656326c634-catalog-content\") pod \"redhat-operators-bm84h\" (UID: \"5740f025-332a-4be3-8473-ec656326c634\") " pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.844282 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk5tg\" (UniqueName: \"kubernetes.io/projected/5740f025-332a-4be3-8473-ec656326c634-kube-api-access-qk5tg\") pod \"redhat-operators-bm84h\" (UID: \"5740f025-332a-4be3-8473-ec656326c634\") " 
pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.871568 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q7flg"] Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.873747 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.877732 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.882175 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7flg"] Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.925488 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb9f0c2-c760-4f02-81d0-37194af5c296-catalog-content\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.925833 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd6fj\" (UniqueName: \"kubernetes.io/projected/cbb9f0c2-c760-4f02-81d0-37194af5c296-kube-api-access-nd6fj\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:27 crc kubenswrapper[4731]: I1203 18:59:27.925888 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb9f0c2-c760-4f02-81d0-37194af5c296-utilities\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " 
pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.004717 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.026864 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd6fj\" (UniqueName: \"kubernetes.io/projected/cbb9f0c2-c760-4f02-81d0-37194af5c296-kube-api-access-nd6fj\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.026917 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb9f0c2-c760-4f02-81d0-37194af5c296-utilities\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.026981 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb9f0c2-c760-4f02-81d0-37194af5c296-catalog-content\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.027500 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb9f0c2-c760-4f02-81d0-37194af5c296-catalog-content\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.028050 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cbb9f0c2-c760-4f02-81d0-37194af5c296-utilities\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.049399 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd6fj\" (UniqueName: \"kubernetes.io/projected/cbb9f0c2-c760-4f02-81d0-37194af5c296-kube-api-access-nd6fj\") pod \"redhat-marketplace-q7flg\" (UID: \"cbb9f0c2-c760-4f02-81d0-37194af5c296\") " pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.175581 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf62s" event={"ID":"a94f173d-5304-4cd9-bdfc-2dfb032b154c","Type":"ContainerStarted","Data":"cc22e618a3af91ba10c9c27eef2a071c97356959ffc1ca02738652006ca57809"} Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.179942 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljts9" event={"ID":"87bd198b-ff22-4e12-86ab-dfb52adbe31c","Type":"ContainerStarted","Data":"f5ecefd46655370627d4f528afeb4c1f6244e4c94410fa73126d4117a1924e5c"} Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.250030 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.405453 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bm84h"] Dec 03 18:59:28 crc kubenswrapper[4731]: W1203 18:59:28.412734 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5740f025_332a_4be3_8473_ec656326c634.slice/crio-5de0d6bb228251e156bf6079767095ddd77705de95feb59e33c7625db1e0ea89 WatchSource:0}: Error finding container 5de0d6bb228251e156bf6079767095ddd77705de95feb59e33c7625db1e0ea89: Status 404 returned error can't find the container with id 5de0d6bb228251e156bf6079767095ddd77705de95feb59e33c7625db1e0ea89 Dec 03 18:59:28 crc kubenswrapper[4731]: I1203 18:59:28.644893 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7flg"] Dec 03 18:59:28 crc kubenswrapper[4731]: W1203 18:59:28.650710 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb9f0c2_c760_4f02_81d0_37194af5c296.slice/crio-068c7f164c8aefcfa948ef681f277206653dd465acbc8f24dae49dabc6fafb54 WatchSource:0}: Error finding container 068c7f164c8aefcfa948ef681f277206653dd465acbc8f24dae49dabc6fafb54: Status 404 returned error can't find the container with id 068c7f164c8aefcfa948ef681f277206653dd465acbc8f24dae49dabc6fafb54 Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.186308 4731 generic.go:334] "Generic (PLEG): container finished" podID="87bd198b-ff22-4e12-86ab-dfb52adbe31c" containerID="f5ecefd46655370627d4f528afeb4c1f6244e4c94410fa73126d4117a1924e5c" exitCode=0 Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.186585 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljts9" 
event={"ID":"87bd198b-ff22-4e12-86ab-dfb52adbe31c","Type":"ContainerDied","Data":"f5ecefd46655370627d4f528afeb4c1f6244e4c94410fa73126d4117a1924e5c"} Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.194441 4731 generic.go:334] "Generic (PLEG): container finished" podID="a94f173d-5304-4cd9-bdfc-2dfb032b154c" containerID="cc22e618a3af91ba10c9c27eef2a071c97356959ffc1ca02738652006ca57809" exitCode=0 Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.194525 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf62s" event={"ID":"a94f173d-5304-4cd9-bdfc-2dfb032b154c","Type":"ContainerDied","Data":"cc22e618a3af91ba10c9c27eef2a071c97356959ffc1ca02738652006ca57809"} Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.197480 4731 generic.go:334] "Generic (PLEG): container finished" podID="5740f025-332a-4be3-8473-ec656326c634" containerID="b85a5493ed66d7e6602e339faba053d86d5e0d30cfaf7d3bb3d5f3e0361eb3c4" exitCode=0 Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.197519 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm84h" event={"ID":"5740f025-332a-4be3-8473-ec656326c634","Type":"ContainerDied","Data":"b85a5493ed66d7e6602e339faba053d86d5e0d30cfaf7d3bb3d5f3e0361eb3c4"} Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.197537 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm84h" event={"ID":"5740f025-332a-4be3-8473-ec656326c634","Type":"ContainerStarted","Data":"5de0d6bb228251e156bf6079767095ddd77705de95feb59e33c7625db1e0ea89"} Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.199625 4731 generic.go:334] "Generic (PLEG): container finished" podID="cbb9f0c2-c760-4f02-81d0-37194af5c296" containerID="5b1b418fd5e2d02ab924b97c099d11e25d224da0f9aface525d8bbab780d801d" exitCode=0 Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.199677 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-q7flg" event={"ID":"cbb9f0c2-c760-4f02-81d0-37194af5c296","Type":"ContainerDied","Data":"5b1b418fd5e2d02ab924b97c099d11e25d224da0f9aface525d8bbab780d801d"} Dec 03 18:59:29 crc kubenswrapper[4731]: I1203 18:59:29.199707 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7flg" event={"ID":"cbb9f0c2-c760-4f02-81d0-37194af5c296","Type":"ContainerStarted","Data":"068c7f164c8aefcfa948ef681f277206653dd465acbc8f24dae49dabc6fafb54"} Dec 03 18:59:30 crc kubenswrapper[4731]: I1203 18:59:30.206314 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm84h" event={"ID":"5740f025-332a-4be3-8473-ec656326c634","Type":"ContainerStarted","Data":"bc5ebd79169c8a5b555be3ccb09b03d8d98cb3c0d82a9dbbe38d321ec140b645"} Dec 03 18:59:30 crc kubenswrapper[4731]: I1203 18:59:30.216105 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7flg" event={"ID":"cbb9f0c2-c760-4f02-81d0-37194af5c296","Type":"ContainerStarted","Data":"8f2a0b354dfb36f5c86d29aafda29b0518027703ca1bafe7afa6153d4819d40e"} Dec 03 18:59:30 crc kubenswrapper[4731]: I1203 18:59:30.218187 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljts9" event={"ID":"87bd198b-ff22-4e12-86ab-dfb52adbe31c","Type":"ContainerStarted","Data":"5a848b537b747d1c2ad1217734b10eaa157c3d34ebaec2e9c5ede3c589191285"} Dec 03 18:59:30 crc kubenswrapper[4731]: I1203 18:59:30.220601 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf62s" event={"ID":"a94f173d-5304-4cd9-bdfc-2dfb032b154c","Type":"ContainerStarted","Data":"e17c6170802d9dde8d63f4d58c42d1daf1bbb613b380e16145aef2f0af0c2d05"} Dec 03 18:59:30 crc kubenswrapper[4731]: I1203 18:59:30.252005 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-ljts9" podStartSLOduration=2.800871502 podStartE2EDuration="5.251984451s" podCreationTimestamp="2025-12-03 18:59:25 +0000 UTC" firstStartedPulling="2025-12-03 18:59:27.161766772 +0000 UTC m=+287.760361236" lastFinishedPulling="2025-12-03 18:59:29.612879721 +0000 UTC m=+290.211474185" observedRunningTime="2025-12-03 18:59:30.248476957 +0000 UTC m=+290.847071421" watchObservedRunningTime="2025-12-03 18:59:30.251984451 +0000 UTC m=+290.850578915" Dec 03 18:59:30 crc kubenswrapper[4731]: I1203 18:59:30.288048 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sf62s" podStartSLOduration=2.845574639 podStartE2EDuration="5.288030598s" podCreationTimestamp="2025-12-03 18:59:25 +0000 UTC" firstStartedPulling="2025-12-03 18:59:27.166145144 +0000 UTC m=+287.764739608" lastFinishedPulling="2025-12-03 18:59:29.608601103 +0000 UTC m=+290.207195567" observedRunningTime="2025-12-03 18:59:30.285907229 +0000 UTC m=+290.884501713" watchObservedRunningTime="2025-12-03 18:59:30.288030598 +0000 UTC m=+290.886625072" Dec 03 18:59:31 crc kubenswrapper[4731]: I1203 18:59:31.229518 4731 generic.go:334] "Generic (PLEG): container finished" podID="5740f025-332a-4be3-8473-ec656326c634" containerID="bc5ebd79169c8a5b555be3ccb09b03d8d98cb3c0d82a9dbbe38d321ec140b645" exitCode=0 Dec 03 18:59:31 crc kubenswrapper[4731]: I1203 18:59:31.229629 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm84h" event={"ID":"5740f025-332a-4be3-8473-ec656326c634","Type":"ContainerDied","Data":"bc5ebd79169c8a5b555be3ccb09b03d8d98cb3c0d82a9dbbe38d321ec140b645"} Dec 03 18:59:31 crc kubenswrapper[4731]: I1203 18:59:31.237930 4731 generic.go:334] "Generic (PLEG): container finished" podID="cbb9f0c2-c760-4f02-81d0-37194af5c296" containerID="8f2a0b354dfb36f5c86d29aafda29b0518027703ca1bafe7afa6153d4819d40e" exitCode=0 Dec 03 18:59:31 crc kubenswrapper[4731]: 
I1203 18:59:31.239551 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7flg" event={"ID":"cbb9f0c2-c760-4f02-81d0-37194af5c296","Type":"ContainerDied","Data":"8f2a0b354dfb36f5c86d29aafda29b0518027703ca1bafe7afa6153d4819d40e"} Dec 03 18:59:33 crc kubenswrapper[4731]: I1203 18:59:33.263294 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm84h" event={"ID":"5740f025-332a-4be3-8473-ec656326c634","Type":"ContainerStarted","Data":"b0078f36c3a11f2449cf99011153826f2adda786bf8ea02f887fd05294a08883"} Dec 03 18:59:33 crc kubenswrapper[4731]: I1203 18:59:33.265541 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7flg" event={"ID":"cbb9f0c2-c760-4f02-81d0-37194af5c296","Type":"ContainerStarted","Data":"40b52692bb4a9345160b468f1eff220bcaf1aa2ccd836a697aaa042e741c8c53"} Dec 03 18:59:33 crc kubenswrapper[4731]: I1203 18:59:33.286137 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bm84h" podStartSLOduration=3.850073272 podStartE2EDuration="6.286121934s" podCreationTimestamp="2025-12-03 18:59:27 +0000 UTC" firstStartedPulling="2025-12-03 18:59:29.199743757 +0000 UTC m=+289.798338221" lastFinishedPulling="2025-12-03 18:59:31.635792429 +0000 UTC m=+292.234386883" observedRunningTime="2025-12-03 18:59:33.285175274 +0000 UTC m=+293.883769748" watchObservedRunningTime="2025-12-03 18:59:33.286121934 +0000 UTC m=+293.884716388" Dec 03 18:59:33 crc kubenswrapper[4731]: I1203 18:59:33.304076 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q7flg" podStartSLOduration=3.838567109 podStartE2EDuration="6.304058065s" podCreationTimestamp="2025-12-03 18:59:27 +0000 UTC" firstStartedPulling="2025-12-03 18:59:29.202157275 +0000 UTC m=+289.800751739" lastFinishedPulling="2025-12-03 18:59:31.667648231 +0000 UTC 
m=+292.266242695" observedRunningTime="2025-12-03 18:59:33.303457975 +0000 UTC m=+293.902052439" watchObservedRunningTime="2025-12-03 18:59:33.304058065 +0000 UTC m=+293.902652529" Dec 03 18:59:35 crc kubenswrapper[4731]: I1203 18:59:35.584084 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:35 crc kubenswrapper[4731]: I1203 18:59:35.584466 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:35 crc kubenswrapper[4731]: I1203 18:59:35.631949 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:35 crc kubenswrapper[4731]: I1203 18:59:35.792420 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:35 crc kubenswrapper[4731]: I1203 18:59:35.792780 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:35 crc kubenswrapper[4731]: I1203 18:59:35.833799 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:36 crc kubenswrapper[4731]: I1203 18:59:36.319877 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sf62s" Dec 03 18:59:36 crc kubenswrapper[4731]: I1203 18:59:36.321879 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ljts9" Dec 03 18:59:38 crc kubenswrapper[4731]: I1203 18:59:38.005485 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:38 crc kubenswrapper[4731]: I1203 18:59:38.005941 4731 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:38 crc kubenswrapper[4731]: I1203 18:59:38.056943 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:38 crc kubenswrapper[4731]: I1203 18:59:38.250998 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:38 crc kubenswrapper[4731]: I1203 18:59:38.251080 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:38 crc kubenswrapper[4731]: I1203 18:59:38.290466 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:38 crc kubenswrapper[4731]: I1203 18:59:38.330843 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q7flg" Dec 03 18:59:38 crc kubenswrapper[4731]: I1203 18:59:38.331208 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bm84h" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.532724 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nl84r"] Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.533933 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.556267 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nl84r"] Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.682107 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.682176 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31c8f722-a4ab-4d77-a1b4-17bd194beff5-registry-certificates\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.682194 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31c8f722-a4ab-4d77-a1b4-17bd194beff5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.682221 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31c8f722-a4ab-4d77-a1b4-17bd194beff5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.682248 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31c8f722-a4ab-4d77-a1b4-17bd194beff5-trusted-ca\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.682285 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-registry-tls\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.682308 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t852p\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-kube-api-access-t852p\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.682329 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-bound-sa-token\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.773064 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.783689 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31c8f722-a4ab-4d77-a1b4-17bd194beff5-registry-certificates\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.783763 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31c8f722-a4ab-4d77-a1b4-17bd194beff5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.783975 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31c8f722-a4ab-4d77-a1b4-17bd194beff5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.784609 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31c8f722-a4ab-4d77-a1b4-17bd194beff5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.784012 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31c8f722-a4ab-4d77-a1b4-17bd194beff5-trusted-ca\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.784832 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-registry-tls\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.784864 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t852p\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-kube-api-access-t852p\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.784883 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-bound-sa-token\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.785036 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31c8f722-a4ab-4d77-a1b4-17bd194beff5-registry-certificates\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 
18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.785144 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31c8f722-a4ab-4d77-a1b4-17bd194beff5-trusted-ca\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.791870 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31c8f722-a4ab-4d77-a1b4-17bd194beff5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.791870 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-registry-tls\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.811764 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-bound-sa-token\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.815956 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t852p\" (UniqueName: \"kubernetes.io/projected/31c8f722-a4ab-4d77-a1b4-17bd194beff5-kube-api-access-t852p\") pod \"image-registry-66df7c8f76-nl84r\" (UID: \"31c8f722-a4ab-4d77-a1b4-17bd194beff5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:42 crc kubenswrapper[4731]: I1203 18:59:42.849350 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:43 crc kubenswrapper[4731]: I1203 18:59:43.321666 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nl84r"] Dec 03 18:59:43 crc kubenswrapper[4731]: W1203 18:59:43.330055 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c8f722_a4ab_4d77_a1b4_17bd194beff5.slice/crio-e72fc96a040c3e469d39f7a7b708c1b44c49374cb008ef44f2ca46484087f150 WatchSource:0}: Error finding container e72fc96a040c3e469d39f7a7b708c1b44c49374cb008ef44f2ca46484087f150: Status 404 returned error can't find the container with id e72fc96a040c3e469d39f7a7b708c1b44c49374cb008ef44f2ca46484087f150 Dec 03 18:59:44 crc kubenswrapper[4731]: I1203 18:59:44.316879 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" event={"ID":"31c8f722-a4ab-4d77-a1b4-17bd194beff5","Type":"ContainerStarted","Data":"e72fc96a040c3e469d39f7a7b708c1b44c49374cb008ef44f2ca46484087f150"} Dec 03 18:59:45 crc kubenswrapper[4731]: I1203 18:59:45.324141 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" event={"ID":"31c8f722-a4ab-4d77-a1b4-17bd194beff5","Type":"ContainerStarted","Data":"173d052a6a258d6f0d55c87d11f9a6a07d59f67a9a8c7389881a25ceb9bf199b"} Dec 03 18:59:45 crc kubenswrapper[4731]: I1203 18:59:45.324306 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 18:59:45 crc kubenswrapper[4731]: I1203 18:59:45.346669 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" podStartSLOduration=3.346650467 podStartE2EDuration="3.346650467s" podCreationTimestamp="2025-12-03 18:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:59:45.342895166 +0000 UTC m=+305.941489640" watchObservedRunningTime="2025-12-03 18:59:45.346650467 +0000 UTC m=+305.945244921" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.170510 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5"] Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.172207 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.174781 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.177074 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.184987 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5"] Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.239844 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-config-volume\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.239933 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tg5t\" (UniqueName: \"kubernetes.io/projected/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-kube-api-access-4tg5t\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.239971 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-secret-volume\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.341210 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-config-volume\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.341573 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tg5t\" (UniqueName: \"kubernetes.io/projected/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-kube-api-access-4tg5t\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.341674 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-secret-volume\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.342157 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-config-volume\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.352217 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-secret-volume\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.357515 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tg5t\" (UniqueName: \"kubernetes.io/projected/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-kube-api-access-4tg5t\") pod \"collect-profiles-29413140-ln7m5\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.494512 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:00 crc kubenswrapper[4731]: I1203 19:00:00.922857 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5"] Dec 03 19:00:01 crc kubenswrapper[4731]: I1203 19:00:01.416911 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" event={"ID":"76fea8e6-1b0c-4040-b767-fc7f6205d4ab","Type":"ContainerStarted","Data":"7f74847f85b4bc51ed2d81b1abac22e5b3fded26ded2af960e406940f234f35e"} Dec 03 19:00:02 crc kubenswrapper[4731]: I1203 19:00:02.435984 4731 generic.go:334] "Generic (PLEG): container finished" podID="76fea8e6-1b0c-4040-b767-fc7f6205d4ab" containerID="eeebf7b4a970e04e0250d523e60ace8de9dbd7afcdb2437bf4a2f045e760256d" exitCode=0 Dec 03 19:00:02 crc kubenswrapper[4731]: I1203 19:00:02.436097 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" event={"ID":"76fea8e6-1b0c-4040-b767-fc7f6205d4ab","Type":"ContainerDied","Data":"eeebf7b4a970e04e0250d523e60ace8de9dbd7afcdb2437bf4a2f045e760256d"} Dec 03 19:00:02 crc kubenswrapper[4731]: I1203 19:00:02.854765 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-nl84r" Dec 03 19:00:02 crc kubenswrapper[4731]: I1203 19:00:02.901225 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlh5v"] Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.672903 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.787832 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-config-volume\") pod \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.787873 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tg5t\" (UniqueName: \"kubernetes.io/projected/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-kube-api-access-4tg5t\") pod \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.787925 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-secret-volume\") pod \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\" (UID: \"76fea8e6-1b0c-4040-b767-fc7f6205d4ab\") " Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.788707 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "76fea8e6-1b0c-4040-b767-fc7f6205d4ab" (UID: "76fea8e6-1b0c-4040-b767-fc7f6205d4ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.793420 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-kube-api-access-4tg5t" (OuterVolumeSpecName: "kube-api-access-4tg5t") pod "76fea8e6-1b0c-4040-b767-fc7f6205d4ab" (UID: "76fea8e6-1b0c-4040-b767-fc7f6205d4ab"). 
InnerVolumeSpecName "kube-api-access-4tg5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.793568 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "76fea8e6-1b0c-4040-b767-fc7f6205d4ab" (UID: "76fea8e6-1b0c-4040-b767-fc7f6205d4ab"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.889900 4731 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.889931 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tg5t\" (UniqueName: \"kubernetes.io/projected/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-kube-api-access-4tg5t\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:03 crc kubenswrapper[4731]: I1203 19:00:03.889944 4731 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76fea8e6-1b0c-4040-b767-fc7f6205d4ab-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:04 crc kubenswrapper[4731]: I1203 19:00:04.460909 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" event={"ID":"76fea8e6-1b0c-4040-b767-fc7f6205d4ab","Type":"ContainerDied","Data":"7f74847f85b4bc51ed2d81b1abac22e5b3fded26ded2af960e406940f234f35e"} Dec 03 19:00:04 crc kubenswrapper[4731]: I1203 19:00:04.460968 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f74847f85b4bc51ed2d81b1abac22e5b3fded26ded2af960e406940f234f35e" Dec 03 19:00:04 crc kubenswrapper[4731]: I1203 19:00:04.460966 4731 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5" Dec 03 19:00:26 crc kubenswrapper[4731]: I1203 19:00:26.468407 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:00:26 crc kubenswrapper[4731]: I1203 19:00:26.469211 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:00:27 crc kubenswrapper[4731]: I1203 19:00:27.948008 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" podUID="63540dce-b2ef-48ab-9aad-dc0afcbec369" containerName="registry" containerID="cri-o://08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb" gracePeriod=30 Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.282039 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.325522 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-certificates\") pod \"63540dce-b2ef-48ab-9aad-dc0afcbec369\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.327034 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "63540dce-b2ef-48ab-9aad-dc0afcbec369" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.427271 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"63540dce-b2ef-48ab-9aad-dc0afcbec369\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.427726 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-trusted-ca\") pod \"63540dce-b2ef-48ab-9aad-dc0afcbec369\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.427933 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63540dce-b2ef-48ab-9aad-dc0afcbec369-ca-trust-extracted\") pod \"63540dce-b2ef-48ab-9aad-dc0afcbec369\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " 
Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.428070 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-bound-sa-token\") pod \"63540dce-b2ef-48ab-9aad-dc0afcbec369\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.428209 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbp72\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-kube-api-access-pbp72\") pod \"63540dce-b2ef-48ab-9aad-dc0afcbec369\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.428379 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63540dce-b2ef-48ab-9aad-dc0afcbec369-installation-pull-secrets\") pod \"63540dce-b2ef-48ab-9aad-dc0afcbec369\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.428480 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "63540dce-b2ef-48ab-9aad-dc0afcbec369" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.428607 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-tls\") pod \"63540dce-b2ef-48ab-9aad-dc0afcbec369\" (UID: \"63540dce-b2ef-48ab-9aad-dc0afcbec369\") " Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.429137 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.429386 4731 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.434284 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-kube-api-access-pbp72" (OuterVolumeSpecName: "kube-api-access-pbp72") pod "63540dce-b2ef-48ab-9aad-dc0afcbec369" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369"). InnerVolumeSpecName "kube-api-access-pbp72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.434842 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63540dce-b2ef-48ab-9aad-dc0afcbec369-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "63540dce-b2ef-48ab-9aad-dc0afcbec369" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.435522 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "63540dce-b2ef-48ab-9aad-dc0afcbec369" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.435815 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "63540dce-b2ef-48ab-9aad-dc0afcbec369" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.449174 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63540dce-b2ef-48ab-9aad-dc0afcbec369-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "63540dce-b2ef-48ab-9aad-dc0afcbec369" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.449923 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "63540dce-b2ef-48ab-9aad-dc0afcbec369" (UID: "63540dce-b2ef-48ab-9aad-dc0afcbec369"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.530653 4731 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.530714 4731 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63540dce-b2ef-48ab-9aad-dc0afcbec369-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.530726 4731 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.530735 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbp72\" (UniqueName: \"kubernetes.io/projected/63540dce-b2ef-48ab-9aad-dc0afcbec369-kube-api-access-pbp72\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.530748 4731 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63540dce-b2ef-48ab-9aad-dc0afcbec369-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.601320 4731 generic.go:334] "Generic (PLEG): container finished" podID="63540dce-b2ef-48ab-9aad-dc0afcbec369" containerID="08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb" exitCode=0 Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.601456 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.601496 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" event={"ID":"63540dce-b2ef-48ab-9aad-dc0afcbec369","Type":"ContainerDied","Data":"08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb"} Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.601991 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xlh5v" event={"ID":"63540dce-b2ef-48ab-9aad-dc0afcbec369","Type":"ContainerDied","Data":"9688aa614828f39713fce947ff6fcab8e091ce6e0a330a12b13ade1829e390dc"} Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.602034 4731 scope.go:117] "RemoveContainer" containerID="08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.619697 4731 scope.go:117] "RemoveContainer" containerID="08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb" Dec 03 19:00:28 crc kubenswrapper[4731]: E1203 19:00:28.620698 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb\": container with ID starting with 08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb not found: ID does not exist" containerID="08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.620772 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb"} err="failed to get container status \"08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb\": rpc error: code = NotFound desc = could not find container 
\"08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb\": container with ID starting with 08bf5f72dd5dd34d0d432f264d02ee8b836681cf026744249407a7f7881f37eb not found: ID does not exist" Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.635355 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlh5v"] Dec 03 19:00:28 crc kubenswrapper[4731]: I1203 19:00:28.642630 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlh5v"] Dec 03 19:00:29 crc kubenswrapper[4731]: I1203 19:00:29.867952 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63540dce-b2ef-48ab-9aad-dc0afcbec369" path="/var/lib/kubelet/pods/63540dce-b2ef-48ab-9aad-dc0afcbec369/volumes" Dec 03 19:00:56 crc kubenswrapper[4731]: I1203 19:00:56.468958 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:00:56 crc kubenswrapper[4731]: I1203 19:00:56.469910 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.468854 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.469629 4731 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.469730 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.470765 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2a4fcd1c819b27038a8a6c4b9535c35827a67a748cd677cace1813cd153b256"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.470890 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://a2a4fcd1c819b27038a8a6c4b9535c35827a67a748cd677cace1813cd153b256" gracePeriod=600 Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.983798 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="a2a4fcd1c819b27038a8a6c4b9535c35827a67a748cd677cace1813cd153b256" exitCode=0 Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.983881 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"a2a4fcd1c819b27038a8a6c4b9535c35827a67a748cd677cace1813cd153b256"} Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.984018 
4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"b45c9a750fcf751b97bdf7b9a62b87ccd567866e7ff290c0aea383c7e095916a"} Dec 03 19:01:26 crc kubenswrapper[4731]: I1203 19:01:26.984043 4731 scope.go:117] "RemoveContainer" containerID="c98a346677b22d07d6f17286a0d2db33d97d9ba9acb50c45c802074e052c21ad" Dec 03 19:03:26 crc kubenswrapper[4731]: I1203 19:03:26.469106 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:03:26 crc kubenswrapper[4731]: I1203 19:03:26.469831 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:03:56 crc kubenswrapper[4731]: I1203 19:03:56.469127 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:03:56 crc kubenswrapper[4731]: I1203 19:03:56.469964 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:04:26 crc kubenswrapper[4731]: I1203 
19:04:26.469574 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:04:26 crc kubenswrapper[4731]: I1203 19:04:26.470619 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:04:26 crc kubenswrapper[4731]: I1203 19:04:26.470709 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:04:26 crc kubenswrapper[4731]: I1203 19:04:26.471844 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b45c9a750fcf751b97bdf7b9a62b87ccd567866e7ff290c0aea383c7e095916a"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:04:26 crc kubenswrapper[4731]: I1203 19:04:26.471960 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://b45c9a750fcf751b97bdf7b9a62b87ccd567866e7ff290c0aea383c7e095916a" gracePeriod=600 Dec 03 19:04:27 crc kubenswrapper[4731]: I1203 19:04:27.186890 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="b45c9a750fcf751b97bdf7b9a62b87ccd567866e7ff290c0aea383c7e095916a" exitCode=0 Dec 03 
19:04:27 crc kubenswrapper[4731]: I1203 19:04:27.186985 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"b45c9a750fcf751b97bdf7b9a62b87ccd567866e7ff290c0aea383c7e095916a"} Dec 03 19:04:27 crc kubenswrapper[4731]: I1203 19:04:27.187407 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"800db96b32fec13e6990bc15d820baddf81db70da12482cb022dfa84b57e785b"} Dec 03 19:04:27 crc kubenswrapper[4731]: I1203 19:04:27.187440 4731 scope.go:117] "RemoveContainer" containerID="a2a4fcd1c819b27038a8a6c4b9535c35827a67a748cd677cace1813cd153b256" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.111132 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ffkkn"] Dec 03 19:04:34 crc kubenswrapper[4731]: E1203 19:04:34.112061 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fea8e6-1b0c-4040-b767-fc7f6205d4ab" containerName="collect-profiles" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.112081 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fea8e6-1b0c-4040-b767-fc7f6205d4ab" containerName="collect-profiles" Dec 03 19:04:34 crc kubenswrapper[4731]: E1203 19:04:34.112110 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63540dce-b2ef-48ab-9aad-dc0afcbec369" containerName="registry" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.112118 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="63540dce-b2ef-48ab-9aad-dc0afcbec369" containerName="registry" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.112230 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="63540dce-b2ef-48ab-9aad-dc0afcbec369" containerName="registry" 
Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.112274 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fea8e6-1b0c-4040-b767-fc7f6205d4ab" containerName="collect-profiles" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.112757 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-ffkkn" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.120617 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.121577 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ffkkn"] Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.121931 4731 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-7stxs" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.122273 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.132330 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-td96h"] Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.134658 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-td96h" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.137683 4731 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hhntm" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.147747 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-td96h"] Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.159637 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2vfp9"] Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.160671 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.169933 4731 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dlwnl" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.177452 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2vfp9"] Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.239815 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfsg4\" (UniqueName: \"kubernetes.io/projected/670ef28e-2fa7-479f-9d0c-65164095dda5-kube-api-access-zfsg4\") pod \"cert-manager-cainjector-7f985d654d-ffkkn\" (UID: \"670ef28e-2fa7-479f-9d0c-65164095dda5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ffkkn" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.239874 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknj4\" (UniqueName: \"kubernetes.io/projected/a015a221-3196-466d-b58f-79a0b04104ec-kube-api-access-fknj4\") pod \"cert-manager-5b446d88c5-td96h\" (UID: \"a015a221-3196-466d-b58f-79a0b04104ec\") " 
pod="cert-manager/cert-manager-5b446d88c5-td96h" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.239931 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6pnd\" (UniqueName: \"kubernetes.io/projected/50c20406-8225-4008-a120-1e075514ef8d-kube-api-access-d6pnd\") pod \"cert-manager-webhook-5655c58dd6-2vfp9\" (UID: \"50c20406-8225-4008-a120-1e075514ef8d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.340525 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfsg4\" (UniqueName: \"kubernetes.io/projected/670ef28e-2fa7-479f-9d0c-65164095dda5-kube-api-access-zfsg4\") pod \"cert-manager-cainjector-7f985d654d-ffkkn\" (UID: \"670ef28e-2fa7-479f-9d0c-65164095dda5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ffkkn" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.340585 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fknj4\" (UniqueName: \"kubernetes.io/projected/a015a221-3196-466d-b58f-79a0b04104ec-kube-api-access-fknj4\") pod \"cert-manager-5b446d88c5-td96h\" (UID: \"a015a221-3196-466d-b58f-79a0b04104ec\") " pod="cert-manager/cert-manager-5b446d88c5-td96h" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.340628 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6pnd\" (UniqueName: \"kubernetes.io/projected/50c20406-8225-4008-a120-1e075514ef8d-kube-api-access-d6pnd\") pod \"cert-manager-webhook-5655c58dd6-2vfp9\" (UID: \"50c20406-8225-4008-a120-1e075514ef8d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.361903 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6pnd\" (UniqueName: 
\"kubernetes.io/projected/50c20406-8225-4008-a120-1e075514ef8d-kube-api-access-d6pnd\") pod \"cert-manager-webhook-5655c58dd6-2vfp9\" (UID: \"50c20406-8225-4008-a120-1e075514ef8d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.362103 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfsg4\" (UniqueName: \"kubernetes.io/projected/670ef28e-2fa7-479f-9d0c-65164095dda5-kube-api-access-zfsg4\") pod \"cert-manager-cainjector-7f985d654d-ffkkn\" (UID: \"670ef28e-2fa7-479f-9d0c-65164095dda5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-ffkkn" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.362533 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fknj4\" (UniqueName: \"kubernetes.io/projected/a015a221-3196-466d-b58f-79a0b04104ec-kube-api-access-fknj4\") pod \"cert-manager-5b446d88c5-td96h\" (UID: \"a015a221-3196-466d-b58f-79a0b04104ec\") " pod="cert-manager/cert-manager-5b446d88c5-td96h" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.436648 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-ffkkn" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.452884 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-td96h" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.480577 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.754570 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-td96h"] Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.765545 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:04:34 crc kubenswrapper[4731]: I1203 19:04:34.901187 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-ffkkn"] Dec 03 19:04:34 crc kubenswrapper[4731]: W1203 19:04:34.910442 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod670ef28e_2fa7_479f_9d0c_65164095dda5.slice/crio-85f31779eccd283b6459774341f2ea2d7a63611f6a59a41a6df9dedb78df1fce WatchSource:0}: Error finding container 85f31779eccd283b6459774341f2ea2d7a63611f6a59a41a6df9dedb78df1fce: Status 404 returned error can't find the container with id 85f31779eccd283b6459774341f2ea2d7a63611f6a59a41a6df9dedb78df1fce Dec 03 19:04:35 crc kubenswrapper[4731]: I1203 19:04:35.014024 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2vfp9"] Dec 03 19:04:35 crc kubenswrapper[4731]: W1203 19:04:35.014742 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50c20406_8225_4008_a120_1e075514ef8d.slice/crio-788ba4a3dac9ec3a28e03f52a38c9a38544f261b72ed9711181ee878fcdd7626 WatchSource:0}: Error finding container 788ba4a3dac9ec3a28e03f52a38c9a38544f261b72ed9711181ee878fcdd7626: Status 404 returned error can't find the container with id 788ba4a3dac9ec3a28e03f52a38c9a38544f261b72ed9711181ee878fcdd7626 Dec 03 19:04:35 crc kubenswrapper[4731]: I1203 19:04:35.243120 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-ffkkn" event={"ID":"670ef28e-2fa7-479f-9d0c-65164095dda5","Type":"ContainerStarted","Data":"85f31779eccd283b6459774341f2ea2d7a63611f6a59a41a6df9dedb78df1fce"} Dec 03 19:04:35 crc kubenswrapper[4731]: I1203 19:04:35.246544 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-td96h" event={"ID":"a015a221-3196-466d-b58f-79a0b04104ec","Type":"ContainerStarted","Data":"b32a8db2bf23fe8a912b0f06f2fbfbae87299b6ce8f92e9bb3f2f673c75cf557"} Dec 03 19:04:35 crc kubenswrapper[4731]: I1203 19:04:35.247856 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" event={"ID":"50c20406-8225-4008-a120-1e075514ef8d","Type":"ContainerStarted","Data":"788ba4a3dac9ec3a28e03f52a38c9a38544f261b72ed9711181ee878fcdd7626"} Dec 03 19:04:39 crc kubenswrapper[4731]: I1203 19:04:39.270655 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-ffkkn" event={"ID":"670ef28e-2fa7-479f-9d0c-65164095dda5","Type":"ContainerStarted","Data":"efba054607388cbfe64b5eafbbf3678b94e480950cb3b1824cf79e03b84cd918"} Dec 03 19:04:39 crc kubenswrapper[4731]: I1203 19:04:39.273070 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-td96h" event={"ID":"a015a221-3196-466d-b58f-79a0b04104ec","Type":"ContainerStarted","Data":"8bcd4a5c47c71ec6ee6cbb6a96b07c1f9ca263677db35f26c75978c3814cbda2"} Dec 03 19:04:39 crc kubenswrapper[4731]: I1203 19:04:39.274804 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" event={"ID":"50c20406-8225-4008-a120-1e075514ef8d","Type":"ContainerStarted","Data":"e3a8b2dcf7cede3c40b1dd2e98609c1531d2968b43c15e4d5319f766ef3b10f0"} Dec 03 19:04:39 crc kubenswrapper[4731]: I1203 19:04:39.275339 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" Dec 03 19:04:39 crc kubenswrapper[4731]: I1203 19:04:39.287194 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-ffkkn" podStartSLOduration=2.139784023 podStartE2EDuration="5.287174546s" podCreationTimestamp="2025-12-03 19:04:34 +0000 UTC" firstStartedPulling="2025-12-03 19:04:34.913132098 +0000 UTC m=+595.511726562" lastFinishedPulling="2025-12-03 19:04:38.060522621 +0000 UTC m=+598.659117085" observedRunningTime="2025-12-03 19:04:39.284862776 +0000 UTC m=+599.883457250" watchObservedRunningTime="2025-12-03 19:04:39.287174546 +0000 UTC m=+599.885769010" Dec 03 19:04:39 crc kubenswrapper[4731]: I1203 19:04:39.303402 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-td96h" podStartSLOduration=2.00924873 podStartE2EDuration="5.303381323s" podCreationTimestamp="2025-12-03 19:04:34 +0000 UTC" firstStartedPulling="2025-12-03 19:04:34.765271323 +0000 UTC m=+595.363865787" lastFinishedPulling="2025-12-03 19:04:38.059403916 +0000 UTC m=+598.657998380" observedRunningTime="2025-12-03 19:04:39.301025751 +0000 UTC m=+599.899620235" watchObservedRunningTime="2025-12-03 19:04:39.303381323 +0000 UTC m=+599.901975787" Dec 03 19:04:39 crc kubenswrapper[4731]: I1203 19:04:39.322377 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" podStartSLOduration=2.222031651 podStartE2EDuration="5.322359214s" podCreationTimestamp="2025-12-03 19:04:34 +0000 UTC" firstStartedPulling="2025-12-03 19:04:35.018707028 +0000 UTC m=+595.617301502" lastFinishedPulling="2025-12-03 19:04:38.119034601 +0000 UTC m=+598.717629065" observedRunningTime="2025-12-03 19:04:39.318609168 +0000 UTC m=+599.917203642" watchObservedRunningTime="2025-12-03 19:04:39.322359214 +0000 UTC m=+599.920953678" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 
19:04:44.484988 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vfp9" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.559645 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xcsvg"] Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.560148 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovn-controller" containerID="cri-o://1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3" gracePeriod=30 Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.560335 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovn-acl-logging" containerID="cri-o://6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8" gracePeriod=30 Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.560305 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="sbdb" containerID="cri-o://a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81" gracePeriod=30 Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.560341 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="northd" containerID="cri-o://afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f" gracePeriod=30 Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.560350 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" 
podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kube-rbac-proxy-node" containerID="cri-o://c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162" gracePeriod=30 Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.560476 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="nbdb" containerID="cri-o://f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6" gracePeriod=30 Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.560592 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4" gracePeriod=30 Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.626177 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" containerID="cri-o://14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba" gracePeriod=30 Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.904967 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/2.log" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.907816 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovn-acl-logging/0.log" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.908507 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovn-controller/0.log" Dec 03 
19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.908991 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.967308 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lk28m"] Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.968792 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovn-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.968838 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovn-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.968850 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="nbdb" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.968860 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="nbdb" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.968869 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="sbdb" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.968877 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="sbdb" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.968889 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.968896 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.969205 4731 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969218 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.969228 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969234 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.969244 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="northd" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969268 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="northd" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.969277 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969283 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.969290 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovn-acl-logging" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969296 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovn-acl-logging" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.969310 4731 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kubecfg-setup" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969320 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kubecfg-setup" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.969331 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kube-rbac-proxy-node" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969339 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kube-rbac-proxy-node" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969493 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kube-rbac-proxy-node" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969503 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969513 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969524 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969534 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969543 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 19:04:44 crc 
kubenswrapper[4731]: I1203 19:04:44.969552 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovn-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969560 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="nbdb" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969567 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="northd" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969575 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="sbdb" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969583 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovn-acl-logging" Dec 03 19:04:44 crc kubenswrapper[4731]: E1203 19:04:44.969678 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.969685 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" containerName="ovnkube-controller" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.971444 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981150 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-ovn\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981192 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-netd\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981214 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-systemd-units\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981242 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-config\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981280 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-openvswitch\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") " Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981294 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-ovn-kubernetes\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981327 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-systemd\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981354 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-kubelet\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981376 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981410 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-var-lib-openvswitch\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981434 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2676769f-27dd-4ac2-9398-7322817ce55a-ovn-node-metrics-cert\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981454 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-slash\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981477 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-netns\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981504 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-env-overrides\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981520 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-bin\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981537 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-node-log\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981560 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-etc-openvswitch\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981637 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsjcq\" (UniqueName: \"kubernetes.io/projected/2676769f-27dd-4ac2-9398-7322817ce55a-kube-api-access-wsjcq\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981660 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-log-socket\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981687 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-script-lib\") pod \"2676769f-27dd-4ac2-9398-7322817ce55a\" (UID: \"2676769f-27dd-4ac2-9398-7322817ce55a\") "
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.981967 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.982017 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.982041 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.982060 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.982430 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.982562 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.982617 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.982645 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983109 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983194 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983229 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983272 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-slash" (OuterVolumeSpecName: "host-slash") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983299 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983476 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-node-log" (OuterVolumeSpecName: "node-log") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983552 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983809 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.983866 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-log-socket" (OuterVolumeSpecName: "log-socket") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.990936 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2676769f-27dd-4ac2-9398-7322817ce55a-kube-api-access-wsjcq" (OuterVolumeSpecName: "kube-api-access-wsjcq") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "kube-api-access-wsjcq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.992561 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2676769f-27dd-4ac2-9398-7322817ce55a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:04:44 crc kubenswrapper[4731]: I1203 19:04:44.998978 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2676769f-27dd-4ac2-9398-7322817ce55a" (UID: "2676769f-27dd-4ac2-9398-7322817ce55a"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.082709 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-run-ovn-kubernetes\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.082804 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-env-overrides\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.082868 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-ovn\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.082903 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-cni-netd\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083060 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-node-log\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083126 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-kubelet\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083165 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083205 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-etc-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083252 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-slash\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083334 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-ovnkube-script-lib\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083365 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-run-netns\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083398 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5fe1fc5-5e59-4803-833c-609051caad71-ovn-node-metrics-cert\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083432 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-systemd\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083475 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szr6z\" (UniqueName: \"kubernetes.io/projected/a5fe1fc5-5e59-4803-833c-609051caad71-kube-api-access-szr6z\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083574 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-systemd-units\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083620 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083659 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-log-socket\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083691 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-var-lib-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083727 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-cni-bin\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083758 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-ovnkube-config\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083840 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsjcq\" (UniqueName: \"kubernetes.io/projected/2676769f-27dd-4ac2-9398-7322817ce55a-kube-api-access-wsjcq\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083861 4731 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-log-socket\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083882 4731 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083933 4731 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083955 4731 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-netd\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083974 4731 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-systemd-units\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.083992 4731 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084009 4731 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084027 4731 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084044 4731 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-run-systemd\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084061 4731 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-kubelet\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084079 4731 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084099 4731 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084117 4731 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2676769f-27dd-4ac2-9398-7322817ce55a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084134 4731 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-slash\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084151 4731 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-run-netns\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084169 4731 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2676769f-27dd-4ac2-9398-7322817ce55a-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084186 4731 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-host-cni-bin\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084204 4731 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-node-log\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.084223 4731 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2676769f-27dd-4ac2-9398-7322817ce55a-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.185759 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-env-overrides\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.185842 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-ovn\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.185888 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-cni-netd\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.185949 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-node-log\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.185985 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-kubelet\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186027 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186060 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-etc-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186082 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-ovn\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186138 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-node-log\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186175 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186206 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-kubelet\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186101 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-ovnkube-script-lib\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186232 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-cni-netd\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186274 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-slash\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186305 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-run-netns\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186318 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-etc-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186385 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-slash\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186435 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-run-netns\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186335 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5fe1fc5-5e59-4803-833c-609051caad71-ovn-node-metrics-cert\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186495 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-systemd\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186537 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szr6z\" (UniqueName: \"kubernetes.io/projected/a5fe1fc5-5e59-4803-833c-609051caad71-kube-api-access-szr6z\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186593 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-systemd-units\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186620 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186625 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-run-systemd\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186646 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-log-socket\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186670 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-log-socket\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186691 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-systemd-units\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186700 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186731 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-var-lib-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186799 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-var-lib-openvswitch\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186808 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-ovnkube-config\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186858 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-cni-bin\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.186951 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-run-ovn-kubernetes\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.187077 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-cni-bin\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.187138 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fe1fc5-5e59-4803-833c-609051caad71-host-run-ovn-kubernetes\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.187628 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-ovnkube-config\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m"
Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.187701 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-env-overrides\") 
pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.189018 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5fe1fc5-5e59-4803-833c-609051caad71-ovnkube-script-lib\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.193471 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5fe1fc5-5e59-4803-833c-609051caad71-ovn-node-metrics-cert\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.206381 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szr6z\" (UniqueName: \"kubernetes.io/projected/a5fe1fc5-5e59-4803-833c-609051caad71-kube-api-access-szr6z\") pod \"ovnkube-node-lk28m\" (UID: \"a5fe1fc5-5e59-4803-833c-609051caad71\") " pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.290318 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.324787 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7zbk_4ee4f887-8ce3-42c9-9886-06bdf109800c/kube-multus/1.log" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.325714 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7zbk_4ee4f887-8ce3-42c9-9886-06bdf109800c/kube-multus/0.log" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.325819 4731 generic.go:334] "Generic (PLEG): container finished" podID="4ee4f887-8ce3-42c9-9886-06bdf109800c" containerID="99de44708f66eff904f8b59f32bc817fdce391655d10466f27829884f1d19bf0" exitCode=2 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.325941 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7zbk" event={"ID":"4ee4f887-8ce3-42c9-9886-06bdf109800c","Type":"ContainerDied","Data":"99de44708f66eff904f8b59f32bc817fdce391655d10466f27829884f1d19bf0"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.325987 4731 scope.go:117] "RemoveContainer" containerID="200ff970d315f1a6f3d488c4ee0daf3e04b281e3012579c47b5264e3e5ac0887" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.326796 4731 scope.go:117] "RemoveContainer" containerID="99de44708f66eff904f8b59f32bc817fdce391655d10466f27829884f1d19bf0" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.327180 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-x7zbk_openshift-multus(4ee4f887-8ce3-42c9-9886-06bdf109800c)\"" pod="openshift-multus/multus-x7zbk" podUID="4ee4f887-8ce3-42c9-9886-06bdf109800c" Dec 03 19:04:45 crc kubenswrapper[4731]: W1203 19:04:45.332359 4731 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5fe1fc5_5e59_4803_833c_609051caad71.slice/crio-bf96c87576b575dc02649d08d3b3019d96dbe6cd2a6b14fae06e077e95f7cf17 WatchSource:0}: Error finding container bf96c87576b575dc02649d08d3b3019d96dbe6cd2a6b14fae06e077e95f7cf17: Status 404 returned error can't find the container with id bf96c87576b575dc02649d08d3b3019d96dbe6cd2a6b14fae06e077e95f7cf17 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.334309 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovnkube-controller/2.log" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.342915 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovn-acl-logging/0.log" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.348901 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xcsvg_2676769f-27dd-4ac2-9398-7322817ce55a/ovn-controller/0.log" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350414 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba" exitCode=0 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350489 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350517 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81" exitCode=0 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350545 
4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6" exitCode=0 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350554 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350572 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350573 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f" exitCode=0 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350611 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4" exitCode=0 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350583 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350678 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162" exitCode=0 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350704 4731 generic.go:334] 
"Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8" exitCode=143 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350728 4731 generic.go:334] "Generic (PLEG): container finished" podID="2676769f-27dd-4ac2-9398-7322817ce55a" containerID="1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3" exitCode=143 Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350687 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350808 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350853 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350930 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350951 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350968 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.350984 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351005 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351021 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351038 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351057 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351076 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351101 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351130 4731 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351149 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351164 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351180 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351195 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351214 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351231 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351247 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351322 4731 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351339 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351362 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351388 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351406 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351504 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351529 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351550 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} Dec 03 
19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351569 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351585 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351601 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351618 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351691 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351720 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" event={"ID":"2676769f-27dd-4ac2-9398-7322817ce55a","Type":"ContainerDied","Data":"f96b492d102e1801cf1042e561926c64d5ff5475b4d9151bfd2bf249636e0f25"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351779 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351818 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351835 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351852 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351869 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351900 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351916 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351518 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xcsvg" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.351935 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.353139 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.353176 4731 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.389573 4731 scope.go:117] "RemoveContainer" containerID="14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.414566 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xcsvg"] Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.417757 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xcsvg"] Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.423971 4731 scope.go:117] "RemoveContainer" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.442382 4731 scope.go:117] "RemoveContainer" containerID="a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.463875 4731 scope.go:117] "RemoveContainer" containerID="f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.535118 4731 scope.go:117] "RemoveContainer" 
containerID="afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.557481 4731 scope.go:117] "RemoveContainer" containerID="b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.570963 4731 scope.go:117] "RemoveContainer" containerID="c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.585427 4731 scope.go:117] "RemoveContainer" containerID="6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.601789 4731 scope.go:117] "RemoveContainer" containerID="1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.619461 4731 scope.go:117] "RemoveContainer" containerID="9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.635242 4731 scope.go:117] "RemoveContainer" containerID="14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.635702 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": container with ID starting with 14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba not found: ID does not exist" containerID="14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.635755 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} err="failed to get container status \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": rpc error: code = NotFound desc = could not 
find container \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": container with ID starting with 14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.635788 4731 scope.go:117] "RemoveContainer" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.636243 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": container with ID starting with 2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3 not found: ID does not exist" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.636300 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} err="failed to get container status \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": rpc error: code = NotFound desc = could not find container \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": container with ID starting with 2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.636329 4731 scope.go:117] "RemoveContainer" containerID="a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.636572 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": container with ID starting with a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81 not found: ID 
does not exist" containerID="a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.636600 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} err="failed to get container status \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": rpc error: code = NotFound desc = could not find container \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": container with ID starting with a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.636625 4731 scope.go:117] "RemoveContainer" containerID="f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.636879 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": container with ID starting with f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6 not found: ID does not exist" containerID="f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.636907 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} err="failed to get container status \"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": rpc error: code = NotFound desc = could not find container \"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": container with ID starting with f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.636925 4731 
scope.go:117] "RemoveContainer" containerID="afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.637363 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": container with ID starting with afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f not found: ID does not exist" containerID="afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.637391 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} err="failed to get container status \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": rpc error: code = NotFound desc = could not find container \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": container with ID starting with afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.637409 4731 scope.go:117] "RemoveContainer" containerID="b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.637816 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": container with ID starting with b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4 not found: ID does not exist" containerID="b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.637841 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} err="failed to get container status \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": rpc error: code = NotFound desc = could not find container \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": container with ID starting with b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.637859 4731 scope.go:117] "RemoveContainer" containerID="c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.638370 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": container with ID starting with c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162 not found: ID does not exist" containerID="c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.638401 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} err="failed to get container status \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": rpc error: code = NotFound desc = could not find container \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": container with ID starting with c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.638417 4731 scope.go:117] "RemoveContainer" containerID="6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.638643 4731 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": container with ID starting with 6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8 not found: ID does not exist" containerID="6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.638669 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} err="failed to get container status \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": rpc error: code = NotFound desc = could not find container \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": container with ID starting with 6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.638687 4731 scope.go:117] "RemoveContainer" containerID="1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.638976 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": container with ID starting with 1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3 not found: ID does not exist" containerID="1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.638998 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} err="failed to get container status \"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": rpc error: code = NotFound desc = could not find container 
\"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": container with ID starting with 1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.639011 4731 scope.go:117] "RemoveContainer" containerID="9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6" Dec 03 19:04:45 crc kubenswrapper[4731]: E1203 19:04:45.639310 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": container with ID starting with 9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6 not found: ID does not exist" containerID="9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.639334 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} err="failed to get container status \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": rpc error: code = NotFound desc = could not find container \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": container with ID starting with 9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.639351 4731 scope.go:117] "RemoveContainer" containerID="14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.639660 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} err="failed to get container status \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": rpc error: code = NotFound desc = could not find 
container \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": container with ID starting with 14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.639680 4731 scope.go:117] "RemoveContainer" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.640075 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} err="failed to get container status \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": rpc error: code = NotFound desc = could not find container \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": container with ID starting with 2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.640093 4731 scope.go:117] "RemoveContainer" containerID="a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.640356 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} err="failed to get container status \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": rpc error: code = NotFound desc = could not find container \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": container with ID starting with a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.640383 4731 scope.go:117] "RemoveContainer" containerID="f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.640692 4731 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} err="failed to get container status \"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": rpc error: code = NotFound desc = could not find container \"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": container with ID starting with f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.640712 4731 scope.go:117] "RemoveContainer" containerID="afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.640962 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} err="failed to get container status \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": rpc error: code = NotFound desc = could not find container \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": container with ID starting with afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.640979 4731 scope.go:117] "RemoveContainer" containerID="b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.641228 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} err="failed to get container status \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": rpc error: code = NotFound desc = could not find container \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": container with ID starting with 
b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.641274 4731 scope.go:117] "RemoveContainer" containerID="c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.641494 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} err="failed to get container status \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": rpc error: code = NotFound desc = could not find container \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": container with ID starting with c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.641519 4731 scope.go:117] "RemoveContainer" containerID="6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.641767 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} err="failed to get container status \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": rpc error: code = NotFound desc = could not find container \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": container with ID starting with 6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.641786 4731 scope.go:117] "RemoveContainer" containerID="1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.642025 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} err="failed to get container status \"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": rpc error: code = NotFound desc = could not find container \"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": container with ID starting with 1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.642057 4731 scope.go:117] "RemoveContainer" containerID="9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.642300 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} err="failed to get container status \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": rpc error: code = NotFound desc = could not find container \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": container with ID starting with 9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.642328 4731 scope.go:117] "RemoveContainer" containerID="14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.642549 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} err="failed to get container status \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": rpc error: code = NotFound desc = could not find container \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": container with ID starting with 14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba not found: ID does not 
exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.642578 4731 scope.go:117] "RemoveContainer" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.642796 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} err="failed to get container status \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": rpc error: code = NotFound desc = could not find container \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": container with ID starting with 2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.642816 4731 scope.go:117] "RemoveContainer" containerID="a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.643111 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} err="failed to get container status \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": rpc error: code = NotFound desc = could not find container \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": container with ID starting with a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.643128 4731 scope.go:117] "RemoveContainer" containerID="f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.643593 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} err="failed to get container status 
\"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": rpc error: code = NotFound desc = could not find container \"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": container with ID starting with f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.643613 4731 scope.go:117] "RemoveContainer" containerID="afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.643885 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} err="failed to get container status \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": rpc error: code = NotFound desc = could not find container \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": container with ID starting with afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.643903 4731 scope.go:117] "RemoveContainer" containerID="b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.644125 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} err="failed to get container status \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": rpc error: code = NotFound desc = could not find container \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": container with ID starting with b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.644142 4731 scope.go:117] "RemoveContainer" 
containerID="c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.644419 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} err="failed to get container status \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": rpc error: code = NotFound desc = could not find container \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": container with ID starting with c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.644437 4731 scope.go:117] "RemoveContainer" containerID="6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.644740 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} err="failed to get container status \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": rpc error: code = NotFound desc = could not find container \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": container with ID starting with 6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.644765 4731 scope.go:117] "RemoveContainer" containerID="1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.645026 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} err="failed to get container status \"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": rpc error: code = NotFound desc = could 
not find container \"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": container with ID starting with 1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.645045 4731 scope.go:117] "RemoveContainer" containerID="9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.645332 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} err="failed to get container status \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": rpc error: code = NotFound desc = could not find container \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": container with ID starting with 9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.645353 4731 scope.go:117] "RemoveContainer" containerID="14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.645711 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba"} err="failed to get container status \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": rpc error: code = NotFound desc = could not find container \"14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba\": container with ID starting with 14474ed39a964f08e78a4c21f7eebf5b95e8b51c455e6643155c5d71b06c29ba not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.645731 4731 scope.go:117] "RemoveContainer" containerID="2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 
19:04:45.645977 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3"} err="failed to get container status \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": rpc error: code = NotFound desc = could not find container \"2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3\": container with ID starting with 2a7bba0aa3056c760d2977b3184a85d3cde87153572f6922e43a46b00ead1cf3 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.645997 4731 scope.go:117] "RemoveContainer" containerID="a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.646239 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81"} err="failed to get container status \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": rpc error: code = NotFound desc = could not find container \"a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81\": container with ID starting with a097aa9bba610c0cf3227e8532849acdb801dee1551ceca0f00628fa8f486a81 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.646289 4731 scope.go:117] "RemoveContainer" containerID="f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.646594 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6"} err="failed to get container status \"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": rpc error: code = NotFound desc = could not find container \"f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6\": container with ID starting with 
f335efe583e98788ad6fcf20ed98cd53bd521ea0d3f402620421a5335a33b4d6 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.646614 4731 scope.go:117] "RemoveContainer" containerID="afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.646906 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f"} err="failed to get container status \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": rpc error: code = NotFound desc = could not find container \"afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f\": container with ID starting with afed75fe2002c4835c0cbf7dd26407f96128252be73fd7cd7af2ead2a09cd12f not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.646924 4731 scope.go:117] "RemoveContainer" containerID="b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.647206 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4"} err="failed to get container status \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": rpc error: code = NotFound desc = could not find container \"b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4\": container with ID starting with b6c810d9701a6b00895925b846126e032ac1458b544aef770c3a23cf2fb06fb4 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.647224 4731 scope.go:117] "RemoveContainer" containerID="c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.647519 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162"} err="failed to get container status \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": rpc error: code = NotFound desc = could not find container \"c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162\": container with ID starting with c22dfee89b081953abbed214e20386df42c38af2f16f0a747cf5ec178e273162 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.647541 4731 scope.go:117] "RemoveContainer" containerID="6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.647795 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8"} err="failed to get container status \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": rpc error: code = NotFound desc = could not find container \"6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8\": container with ID starting with 6e13dd0623330f1b843e7fd125a1243e42c3ace971a660674387e81c9ee973d8 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.647814 4731 scope.go:117] "RemoveContainer" containerID="1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.648101 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3"} err="failed to get container status \"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": rpc error: code = NotFound desc = could not find container \"1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3\": container with ID starting with 1493cbcc09be074b64160b2138a39d54e9c7125a2d1783ce9869e4e01ad1e2f3 not found: ID does not 
exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.648130 4731 scope.go:117] "RemoveContainer" containerID="9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.648444 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6"} err="failed to get container status \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": rpc error: code = NotFound desc = could not find container \"9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6\": container with ID starting with 9870edf6e8a90fac3673aa46385c5947ec489afc4db0f6041add423b678833d6 not found: ID does not exist" Dec 03 19:04:45 crc kubenswrapper[4731]: I1203 19:04:45.865515 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2676769f-27dd-4ac2-9398-7322817ce55a" path="/var/lib/kubelet/pods/2676769f-27dd-4ac2-9398-7322817ce55a/volumes" Dec 03 19:04:46 crc kubenswrapper[4731]: I1203 19:04:46.357365 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7zbk_4ee4f887-8ce3-42c9-9886-06bdf109800c/kube-multus/1.log" Dec 03 19:04:46 crc kubenswrapper[4731]: I1203 19:04:46.359904 4731 generic.go:334] "Generic (PLEG): container finished" podID="a5fe1fc5-5e59-4803-833c-609051caad71" containerID="150f30f6bdaedf7677ab4577fb0e304ba7064c09e03f231cfc819074df45a2b8" exitCode=0 Dec 03 19:04:46 crc kubenswrapper[4731]: I1203 19:04:46.359941 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerDied","Data":"150f30f6bdaedf7677ab4577fb0e304ba7064c09e03f231cfc819074df45a2b8"} Dec 03 19:04:46 crc kubenswrapper[4731]: I1203 19:04:46.359964 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" 
event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"bf96c87576b575dc02649d08d3b3019d96dbe6cd2a6b14fae06e077e95f7cf17"} Dec 03 19:04:47 crc kubenswrapper[4731]: I1203 19:04:47.369791 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"77e79d4536370762b3a049efa50da55e2ec9d7a90595567d24d1176e05da2f7a"} Dec 03 19:04:47 crc kubenswrapper[4731]: I1203 19:04:47.370387 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"3d619fd74f1aaf269e81bf8b3bebc96ff072a93e044e9f84317944d0555a25c7"} Dec 03 19:04:47 crc kubenswrapper[4731]: I1203 19:04:47.370400 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"00358f6af677ca5e0d7d78a97735fedfb7f1a14db9667f97da82a4c24a6a4b49"} Dec 03 19:04:47 crc kubenswrapper[4731]: I1203 19:04:47.370411 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"0370d2b814ec727987cacc613161c7c6007786cda55a6e821d7e3d63c2d34bf6"} Dec 03 19:04:47 crc kubenswrapper[4731]: I1203 19:04:47.370421 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"f9f7d11428d9aebfad625f0fca7f03f424b8c1410699a19c464569991f548443"} Dec 03 19:04:47 crc kubenswrapper[4731]: I1203 19:04:47.370430 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" 
event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"49724dd9b98dcb6081a4c685d46d65838cd7e1d6fc14f39c87dbc0dd21f6f527"} Dec 03 19:04:49 crc kubenswrapper[4731]: I1203 19:04:49.386888 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"c1a38b1bbdf4d8a0a11f1143527beb37920a617d7705010d469072cb5b18b0e0"} Dec 03 19:04:52 crc kubenswrapper[4731]: I1203 19:04:52.407138 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" event={"ID":"a5fe1fc5-5e59-4803-833c-609051caad71","Type":"ContainerStarted","Data":"b838b44a36e7b8257dd4a7cc0f929ab401ca01dce6b634325ab5bed1963f501e"} Dec 03 19:04:52 crc kubenswrapper[4731]: I1203 19:04:52.407687 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:52 crc kubenswrapper[4731]: I1203 19:04:52.407703 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:52 crc kubenswrapper[4731]: I1203 19:04:52.442930 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:52 crc kubenswrapper[4731]: I1203 19:04:52.443857 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" podStartSLOduration=8.443847591 podStartE2EDuration="8.443847591s" podCreationTimestamp="2025-12-03 19:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:04:52.441559532 +0000 UTC m=+613.040154026" watchObservedRunningTime="2025-12-03 19:04:52.443847591 +0000 UTC m=+613.042442055" Dec 03 19:04:53 crc kubenswrapper[4731]: I1203 19:04:53.412960 4731 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:53 crc kubenswrapper[4731]: I1203 19:04:53.443542 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:04:57 crc kubenswrapper[4731]: I1203 19:04:57.856328 4731 scope.go:117] "RemoveContainer" containerID="99de44708f66eff904f8b59f32bc817fdce391655d10466f27829884f1d19bf0" Dec 03 19:04:58 crc kubenswrapper[4731]: I1203 19:04:58.461520 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7zbk_4ee4f887-8ce3-42c9-9886-06bdf109800c/kube-multus/1.log" Dec 03 19:04:58 crc kubenswrapper[4731]: I1203 19:04:58.461956 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7zbk" event={"ID":"4ee4f887-8ce3-42c9-9886-06bdf109800c","Type":"ContainerStarted","Data":"ec16ed3e4db621fb081e0a3de3eefaee8452340a8048832da363cbda8e09ff15"} Dec 03 19:05:15 crc kubenswrapper[4731]: I1203 19:05:15.319616 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lk28m" Dec 03 19:05:25 crc kubenswrapper[4731]: I1203 19:05:25.939313 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj"] Dec 03 19:05:25 crc kubenswrapper[4731]: I1203 19:05:25.941193 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:25 crc kubenswrapper[4731]: I1203 19:05:25.943412 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 19:05:25 crc kubenswrapper[4731]: I1203 19:05:25.954990 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj"] Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.077385 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc99f\" (UniqueName: \"kubernetes.io/projected/0a940e33-2b46-4dd2-9df7-94c8217c5969-kube-api-access-zc99f\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.077467 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.077521 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: 
I1203 19:05:26.178410 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.178481 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.178539 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc99f\" (UniqueName: \"kubernetes.io/projected/0a940e33-2b46-4dd2-9df7-94c8217c5969-kube-api-access-zc99f\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.179061 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.179076 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.203018 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc99f\" (UniqueName: \"kubernetes.io/projected/0a940e33-2b46-4dd2-9df7-94c8217c5969-kube-api-access-zc99f\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.256921 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.568973 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj"] Dec 03 19:05:26 crc kubenswrapper[4731]: I1203 19:05:26.648501 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" event={"ID":"0a940e33-2b46-4dd2-9df7-94c8217c5969","Type":"ContainerStarted","Data":"6122baa778ebffe49c29c70e35a68e2127d57a49ab6d54b5a852a04366a97e08"} Dec 03 19:05:27 crc kubenswrapper[4731]: I1203 19:05:27.655732 4731 generic.go:334] "Generic (PLEG): container finished" podID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerID="8ebabbce0c98a8b67d00cc58ad6fc1e26cc2b14a4711a92093aff0d59ae1bd6b" exitCode=0 Dec 03 19:05:27 crc kubenswrapper[4731]: I1203 19:05:27.656154 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" event={"ID":"0a940e33-2b46-4dd2-9df7-94c8217c5969","Type":"ContainerDied","Data":"8ebabbce0c98a8b67d00cc58ad6fc1e26cc2b14a4711a92093aff0d59ae1bd6b"} Dec 03 19:05:29 crc kubenswrapper[4731]: I1203 19:05:29.675854 4731 generic.go:334] "Generic (PLEG): container finished" podID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerID="22ec6c6413645921f444570317e6bf9eb208fb3d2b3b965f566a5f20d95f7c4b" exitCode=0 Dec 03 19:05:29 crc kubenswrapper[4731]: I1203 19:05:29.675985 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" event={"ID":"0a940e33-2b46-4dd2-9df7-94c8217c5969","Type":"ContainerDied","Data":"22ec6c6413645921f444570317e6bf9eb208fb3d2b3b965f566a5f20d95f7c4b"} Dec 03 19:05:30 crc kubenswrapper[4731]: I1203 19:05:30.689307 4731 generic.go:334] "Generic (PLEG): container finished" podID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerID="5aa2c087c2d4819e2e119eea50826364a55eee9c3c918366fe4063bc196dc5b4" exitCode=0 Dec 03 19:05:30 crc kubenswrapper[4731]: I1203 19:05:30.689361 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" event={"ID":"0a940e33-2b46-4dd2-9df7-94c8217c5969","Type":"ContainerDied","Data":"5aa2c087c2d4819e2e119eea50826364a55eee9c3c918366fe4063bc196dc5b4"} Dec 03 19:05:31 crc kubenswrapper[4731]: I1203 19:05:31.995335 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.063614 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc99f\" (UniqueName: \"kubernetes.io/projected/0a940e33-2b46-4dd2-9df7-94c8217c5969-kube-api-access-zc99f\") pod \"0a940e33-2b46-4dd2-9df7-94c8217c5969\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.063765 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-util\") pod \"0a940e33-2b46-4dd2-9df7-94c8217c5969\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.063882 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-bundle\") pod \"0a940e33-2b46-4dd2-9df7-94c8217c5969\" (UID: \"0a940e33-2b46-4dd2-9df7-94c8217c5969\") " Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.064809 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-bundle" (OuterVolumeSpecName: "bundle") pod "0a940e33-2b46-4dd2-9df7-94c8217c5969" (UID: "0a940e33-2b46-4dd2-9df7-94c8217c5969"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.074551 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a940e33-2b46-4dd2-9df7-94c8217c5969-kube-api-access-zc99f" (OuterVolumeSpecName: "kube-api-access-zc99f") pod "0a940e33-2b46-4dd2-9df7-94c8217c5969" (UID: "0a940e33-2b46-4dd2-9df7-94c8217c5969"). InnerVolumeSpecName "kube-api-access-zc99f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.077536 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-util" (OuterVolumeSpecName: "util") pod "0a940e33-2b46-4dd2-9df7-94c8217c5969" (UID: "0a940e33-2b46-4dd2-9df7-94c8217c5969"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.166161 4731 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.166234 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc99f\" (UniqueName: \"kubernetes.io/projected/0a940e33-2b46-4dd2-9df7-94c8217c5969-kube-api-access-zc99f\") on node \"crc\" DevicePath \"\"" Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.166293 4731 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a940e33-2b46-4dd2-9df7-94c8217c5969-util\") on node \"crc\" DevicePath \"\"" Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.708377 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" event={"ID":"0a940e33-2b46-4dd2-9df7-94c8217c5969","Type":"ContainerDied","Data":"6122baa778ebffe49c29c70e35a68e2127d57a49ab6d54b5a852a04366a97e08"} Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.708714 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6122baa778ebffe49c29c70e35a68e2127d57a49ab6d54b5a852a04366a97e08" Dec 03 19:05:32 crc kubenswrapper[4731]: I1203 19:05:32.708632 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.551803 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg"] Dec 03 19:05:37 crc kubenswrapper[4731]: E1203 19:05:37.552733 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerName="pull" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.552751 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerName="pull" Dec 03 19:05:37 crc kubenswrapper[4731]: E1203 19:05:37.552772 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerName="util" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.552780 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerName="util" Dec 03 19:05:37 crc kubenswrapper[4731]: E1203 19:05:37.552787 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerName="extract" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.552797 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerName="extract" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.553095 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a940e33-2b46-4dd2-9df7-94c8217c5969" containerName="extract" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.555484 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.563559 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.563911 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.569843 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ggpmx" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.577589 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg"] Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.656869 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5wwv\" (UniqueName: \"kubernetes.io/projected/75cd2a87-eba6-4d06-a66c-3740a62f7496-kube-api-access-l5wwv\") pod \"nmstate-operator-5b5b58f5c8-mg7cg\" (UID: \"75cd2a87-eba6-4d06-a66c-3740a62f7496\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.758809 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5wwv\" (UniqueName: \"kubernetes.io/projected/75cd2a87-eba6-4d06-a66c-3740a62f7496-kube-api-access-l5wwv\") pod \"nmstate-operator-5b5b58f5c8-mg7cg\" (UID: \"75cd2a87-eba6-4d06-a66c-3740a62f7496\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.784615 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5wwv\" (UniqueName: \"kubernetes.io/projected/75cd2a87-eba6-4d06-a66c-3740a62f7496-kube-api-access-l5wwv\") pod \"nmstate-operator-5b5b58f5c8-mg7cg\" (UID: 
\"75cd2a87-eba6-4d06-a66c-3740a62f7496\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg" Dec 03 19:05:37 crc kubenswrapper[4731]: I1203 19:05:37.880473 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg" Dec 03 19:05:38 crc kubenswrapper[4731]: I1203 19:05:38.098233 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg"] Dec 03 19:05:38 crc kubenswrapper[4731]: W1203 19:05:38.107767 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75cd2a87_eba6_4d06_a66c_3740a62f7496.slice/crio-4aa45ec29de1a9512c0549e9201ffc0d6c4dfa1ae8975be928d5381db773c49d WatchSource:0}: Error finding container 4aa45ec29de1a9512c0549e9201ffc0d6c4dfa1ae8975be928d5381db773c49d: Status 404 returned error can't find the container with id 4aa45ec29de1a9512c0549e9201ffc0d6c4dfa1ae8975be928d5381db773c49d Dec 03 19:05:39 crc kubenswrapper[4731]: I1203 19:05:39.135125 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg" event={"ID":"75cd2a87-eba6-4d06-a66c-3740a62f7496","Type":"ContainerStarted","Data":"4aa45ec29de1a9512c0549e9201ffc0d6c4dfa1ae8975be928d5381db773c49d"} Dec 03 19:05:41 crc kubenswrapper[4731]: I1203 19:05:41.148942 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg" event={"ID":"75cd2a87-eba6-4d06-a66c-3740a62f7496","Type":"ContainerStarted","Data":"597c2d26ea38af3d4a51257cc7b471f9cf2c812845c35d44ccd8f5a5b77f708d"} Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.357395 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mg7cg" podStartSLOduration=7.234001432 podStartE2EDuration="9.357377959s" podCreationTimestamp="2025-12-03 19:05:37 +0000 UTC" 
firstStartedPulling="2025-12-03 19:05:38.109990305 +0000 UTC m=+658.708584769" lastFinishedPulling="2025-12-03 19:05:40.233366832 +0000 UTC m=+660.831961296" observedRunningTime="2025-12-03 19:05:41.169658864 +0000 UTC m=+661.768253348" watchObservedRunningTime="2025-12-03 19:05:46.357377959 +0000 UTC m=+666.955972423" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.358247 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.359102 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.360948 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-8996g" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.378633 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.392339 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.393372 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.395590 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.420381 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-g4tbf"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.424368 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.424475 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.527182 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlcxl\" (UniqueName: \"kubernetes.io/projected/85c23854-5afa-4083-acdb-da40a631204b-kube-api-access-nlcxl\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.527269 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/52d46f33-0c4f-403f-a207-8ebb320e6c0d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gpff6\" (UID: \"52d46f33-0c4f-403f-a207-8ebb320e6c0d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.527304 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c4zp\" (UniqueName: \"kubernetes.io/projected/950127c6-7145-4075-9956-2922dcfb6d9a-kube-api-access-6c4zp\") pod \"nmstate-metrics-7f946cbc9-9qx9k\" (UID: \"950127c6-7145-4075-9956-2922dcfb6d9a\") " 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.527344 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-ovs-socket\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.527368 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kbtm\" (UniqueName: \"kubernetes.io/projected/52d46f33-0c4f-403f-a207-8ebb320e6c0d-kube-api-access-8kbtm\") pod \"nmstate-webhook-5f6d4c5ccb-gpff6\" (UID: \"52d46f33-0c4f-403f-a207-8ebb320e6c0d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.527405 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-dbus-socket\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.527436 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-nmstate-lock\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.527577 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.528233 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.530413 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.530601 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.530437 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-7jtth" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.542597 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.628861 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-nmstate-lock\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.628924 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/afac9508-23d7-4b28-a52b-c6bf555cc02a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.628969 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlcxl\" (UniqueName: \"kubernetes.io/projected/85c23854-5afa-4083-acdb-da40a631204b-kube-api-access-nlcxl\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " 
pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629000 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/52d46f33-0c4f-403f-a207-8ebb320e6c0d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gpff6\" (UID: \"52d46f33-0c4f-403f-a207-8ebb320e6c0d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629032 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rssmg\" (UniqueName: \"kubernetes.io/projected/afac9508-23d7-4b28-a52b-c6bf555cc02a-kube-api-access-rssmg\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629057 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c4zp\" (UniqueName: \"kubernetes.io/projected/950127c6-7145-4075-9956-2922dcfb6d9a-kube-api-access-6c4zp\") pod \"nmstate-metrics-7f946cbc9-9qx9k\" (UID: \"950127c6-7145-4075-9956-2922dcfb6d9a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629078 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/afac9508-23d7-4b28-a52b-c6bf555cc02a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629123 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-ovs-socket\") pod 
\"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629145 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kbtm\" (UniqueName: \"kubernetes.io/projected/52d46f33-0c4f-403f-a207-8ebb320e6c0d-kube-api-access-8kbtm\") pod \"nmstate-webhook-5f6d4c5ccb-gpff6\" (UID: \"52d46f33-0c4f-403f-a207-8ebb320e6c0d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629184 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-dbus-socket\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629484 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-dbus-socket\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629537 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-nmstate-lock\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.629649 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/85c23854-5afa-4083-acdb-da40a631204b-ovs-socket\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " 
pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: E1203 19:05:46.629803 4731 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 03 19:05:46 crc kubenswrapper[4731]: E1203 19:05:46.629933 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d46f33-0c4f-403f-a207-8ebb320e6c0d-tls-key-pair podName:52d46f33-0c4f-403f-a207-8ebb320e6c0d nodeName:}" failed. No retries permitted until 2025-12-03 19:05:47.129909979 +0000 UTC m=+667.728504453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/52d46f33-0c4f-403f-a207-8ebb320e6c0d-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-gpff6" (UID: "52d46f33-0c4f-403f-a207-8ebb320e6c0d") : secret "openshift-nmstate-webhook" not found Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.654673 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlcxl\" (UniqueName: \"kubernetes.io/projected/85c23854-5afa-4083-acdb-da40a631204b-kube-api-access-nlcxl\") pod \"nmstate-handler-g4tbf\" (UID: \"85c23854-5afa-4083-acdb-da40a631204b\") " pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.658318 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c4zp\" (UniqueName: \"kubernetes.io/projected/950127c6-7145-4075-9956-2922dcfb6d9a-kube-api-access-6c4zp\") pod \"nmstate-metrics-7f946cbc9-9qx9k\" (UID: \"950127c6-7145-4075-9956-2922dcfb6d9a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.663869 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kbtm\" (UniqueName: \"kubernetes.io/projected/52d46f33-0c4f-403f-a207-8ebb320e6c0d-kube-api-access-8kbtm\") pod \"nmstate-webhook-5f6d4c5ccb-gpff6\" (UID: 
\"52d46f33-0c4f-403f-a207-8ebb320e6c0d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.678999 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.729998 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/afac9508-23d7-4b28-a52b-c6bf555cc02a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.730450 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rssmg\" (UniqueName: \"kubernetes.io/projected/afac9508-23d7-4b28-a52b-c6bf555cc02a-kube-api-access-rssmg\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.730632 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/afac9508-23d7-4b28-a52b-c6bf555cc02a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.732034 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/afac9508-23d7-4b28-a52b-c6bf555cc02a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc 
kubenswrapper[4731]: I1203 19:05:46.739572 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/afac9508-23d7-4b28-a52b-c6bf555cc02a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.754185 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.770989 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5964fc6d77-whgh2"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.771661 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.771816 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rssmg\" (UniqueName: \"kubernetes.io/projected/afac9508-23d7-4b28-a52b-c6bf555cc02a-kube-api-access-rssmg\") pod \"nmstate-console-plugin-7fbb5f6569-7cf69\" (UID: \"afac9508-23d7-4b28-a52b-c6bf555cc02a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.797938 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5964fc6d77-whgh2"] Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.841604 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.935282 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-oauth-config\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.935617 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-trusted-ca-bundle\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.935782 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-oauth-serving-cert\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.935853 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-service-ca\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.935943 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-config\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.936020 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8pv\" (UniqueName: \"kubernetes.io/projected/884cfbf1-4046-4854-8f12-f4fe95d29e65-kube-api-access-9f8pv\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:46 crc kubenswrapper[4731]: I1203 19:05:46.936083 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-serving-cert\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:46.998384 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k"] Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.037056 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-oauth-serving-cert\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.037122 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-service-ca\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " 
pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.037152 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-config\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.037178 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8pv\" (UniqueName: \"kubernetes.io/projected/884cfbf1-4046-4854-8f12-f4fe95d29e65-kube-api-access-9f8pv\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.037240 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-serving-cert\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.037312 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-oauth-config\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.037338 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-trusted-ca-bundle\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " 
pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.038833 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-service-ca\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.039053 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-config\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.039389 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-oauth-serving-cert\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.039652 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/884cfbf1-4046-4854-8f12-f4fe95d29e65-trusted-ca-bundle\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.044173 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-serving-cert\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc 
kubenswrapper[4731]: I1203 19:05:47.044543 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/884cfbf1-4046-4854-8f12-f4fe95d29e65-console-oauth-config\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.056628 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8pv\" (UniqueName: \"kubernetes.io/projected/884cfbf1-4046-4854-8f12-f4fe95d29e65-kube-api-access-9f8pv\") pod \"console-5964fc6d77-whgh2\" (UID: \"884cfbf1-4046-4854-8f12-f4fe95d29e65\") " pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.061041 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69"] Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.105470 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.138966 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/52d46f33-0c4f-403f-a207-8ebb320e6c0d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gpff6\" (UID: \"52d46f33-0c4f-403f-a207-8ebb320e6c0d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.142439 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/52d46f33-0c4f-403f-a207-8ebb320e6c0d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gpff6\" (UID: \"52d46f33-0c4f-403f-a207-8ebb320e6c0d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.183994 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" event={"ID":"afac9508-23d7-4b28-a52b-c6bf555cc02a","Type":"ContainerStarted","Data":"ded26eed3a259f71b7e0905469bf7bd7dadc6a8da9b6caea5814b0310cb008e1"} Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.186167 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-g4tbf" event={"ID":"85c23854-5afa-4083-acdb-da40a631204b","Type":"ContainerStarted","Data":"1597d0d26cd29046ff3bd31cee7cc21a2c9b70c81e473be5b99e7430df8f4ea6"} Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.187418 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" event={"ID":"950127c6-7145-4075-9956-2922dcfb6d9a","Type":"ContainerStarted","Data":"b121defe4a07f88c6096814db0f54c932ba6be2ab473dc205270cfaad90ece43"} Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.271466 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-5964fc6d77-whgh2"] Dec 03 19:05:47 crc kubenswrapper[4731]: W1203 19:05:47.275714 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884cfbf1_4046_4854_8f12_f4fe95d29e65.slice/crio-46604d2f1d6fbc5507cdc52185e452d512cb727bf65a404dbcf2b87232fe1d56 WatchSource:0}: Error finding container 46604d2f1d6fbc5507cdc52185e452d512cb727bf65a404dbcf2b87232fe1d56: Status 404 returned error can't find the container with id 46604d2f1d6fbc5507cdc52185e452d512cb727bf65a404dbcf2b87232fe1d56 Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.311165 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:47 crc kubenswrapper[4731]: I1203 19:05:47.502361 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6"] Dec 03 19:05:47 crc kubenswrapper[4731]: W1203 19:05:47.504964 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d46f33_0c4f_403f_a207_8ebb320e6c0d.slice/crio-30ebaea2c83e18185df5fdc6858b6986ad454e62f5c10b6b3d5a716a45b99d48 WatchSource:0}: Error finding container 30ebaea2c83e18185df5fdc6858b6986ad454e62f5c10b6b3d5a716a45b99d48: Status 404 returned error can't find the container with id 30ebaea2c83e18185df5fdc6858b6986ad454e62f5c10b6b3d5a716a45b99d48 Dec 03 19:05:48 crc kubenswrapper[4731]: I1203 19:05:48.195068 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" event={"ID":"52d46f33-0c4f-403f-a207-8ebb320e6c0d","Type":"ContainerStarted","Data":"30ebaea2c83e18185df5fdc6858b6986ad454e62f5c10b6b3d5a716a45b99d48"} Dec 03 19:05:48 crc kubenswrapper[4731]: I1203 19:05:48.196709 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5964fc6d77-whgh2" 
event={"ID":"884cfbf1-4046-4854-8f12-f4fe95d29e65","Type":"ContainerStarted","Data":"7ebef3c9e02a1b8fd4bb83b929fadc23c6f5f4f96e948d142bd6344632f6d51d"} Dec 03 19:05:48 crc kubenswrapper[4731]: I1203 19:05:48.196755 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5964fc6d77-whgh2" event={"ID":"884cfbf1-4046-4854-8f12-f4fe95d29e65","Type":"ContainerStarted","Data":"46604d2f1d6fbc5507cdc52185e452d512cb727bf65a404dbcf2b87232fe1d56"} Dec 03 19:05:48 crc kubenswrapper[4731]: I1203 19:05:48.222161 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5964fc6d77-whgh2" podStartSLOduration=2.222141067 podStartE2EDuration="2.222141067s" podCreationTimestamp="2025-12-03 19:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:05:48.217204741 +0000 UTC m=+668.815799225" watchObservedRunningTime="2025-12-03 19:05:48.222141067 +0000 UTC m=+668.820735541" Dec 03 19:05:50 crc kubenswrapper[4731]: I1203 19:05:50.213831 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" event={"ID":"52d46f33-0c4f-403f-a207-8ebb320e6c0d","Type":"ContainerStarted","Data":"009ddf3ed7b265202fd94562ad4dfd4594cf877de1a0a334af3fddccf59230b9"} Dec 03 19:05:50 crc kubenswrapper[4731]: I1203 19:05:50.214180 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:05:50 crc kubenswrapper[4731]: I1203 19:05:50.215846 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" event={"ID":"950127c6-7145-4075-9956-2922dcfb6d9a","Type":"ContainerStarted","Data":"14bb6989ec0d76a92eed79b048e7ddad38f3b196b4ef86537f265e6f87cf94d4"} Dec 03 19:05:50 crc kubenswrapper[4731]: I1203 19:05:50.217240 4731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" event={"ID":"afac9508-23d7-4b28-a52b-c6bf555cc02a","Type":"ContainerStarted","Data":"5f4251ec06fc9a1b7a8840dcbd91487f3c109964afd3df3eb99529d54a69453b"} Dec 03 19:05:50 crc kubenswrapper[4731]: I1203 19:05:50.221175 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-g4tbf" event={"ID":"85c23854-5afa-4083-acdb-da40a631204b","Type":"ContainerStarted","Data":"883bc27972e5869dc40b2603287486a7070318c55d03f761a872c644205f3f54"} Dec 03 19:05:50 crc kubenswrapper[4731]: I1203 19:05:50.221299 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:50 crc kubenswrapper[4731]: I1203 19:05:50.229826 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" podStartSLOduration=1.791452246 podStartE2EDuration="4.229809155s" podCreationTimestamp="2025-12-03 19:05:46 +0000 UTC" firstStartedPulling="2025-12-03 19:05:47.507942949 +0000 UTC m=+668.106537413" lastFinishedPulling="2025-12-03 19:05:49.946299848 +0000 UTC m=+670.544894322" observedRunningTime="2025-12-03 19:05:50.229303029 +0000 UTC m=+670.827897493" watchObservedRunningTime="2025-12-03 19:05:50.229809155 +0000 UTC m=+670.828403619" Dec 03 19:05:50 crc kubenswrapper[4731]: I1203 19:05:50.251411 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-g4tbf" podStartSLOduration=1.141732077 podStartE2EDuration="4.251395347s" podCreationTimestamp="2025-12-03 19:05:46 +0000 UTC" firstStartedPulling="2025-12-03 19:05:46.817571585 +0000 UTC m=+667.416166049" lastFinishedPulling="2025-12-03 19:05:49.927234855 +0000 UTC m=+670.525829319" observedRunningTime="2025-12-03 19:05:50.249142516 +0000 UTC m=+670.847737000" watchObservedRunningTime="2025-12-03 19:05:50.251395347 +0000 UTC m=+670.849989811" Dec 03 19:05:50 crc 
kubenswrapper[4731]: I1203 19:05:50.277134 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-7cf69" podStartSLOduration=1.424127988 podStartE2EDuration="4.277114921s" podCreationTimestamp="2025-12-03 19:05:46 +0000 UTC" firstStartedPulling="2025-12-03 19:05:47.066872419 +0000 UTC m=+667.665466883" lastFinishedPulling="2025-12-03 19:05:49.919859342 +0000 UTC m=+670.518453816" observedRunningTime="2025-12-03 19:05:50.275416657 +0000 UTC m=+670.874011131" watchObservedRunningTime="2025-12-03 19:05:50.277114921 +0000 UTC m=+670.875709385" Dec 03 19:05:53 crc kubenswrapper[4731]: I1203 19:05:53.243033 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" event={"ID":"950127c6-7145-4075-9956-2922dcfb6d9a","Type":"ContainerStarted","Data":"f0b1de85ab5494e3679e941e2c8d1ef60053f48d01f2c2ee89fc086121c9e6e4"} Dec 03 19:05:56 crc kubenswrapper[4731]: I1203 19:05:56.791843 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-g4tbf" Dec 03 19:05:56 crc kubenswrapper[4731]: I1203 19:05:56.814374 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9qx9k" podStartSLOduration=5.606455086 podStartE2EDuration="10.814354879s" podCreationTimestamp="2025-12-03 19:05:46 +0000 UTC" firstStartedPulling="2025-12-03 19:05:47.015142453 +0000 UTC m=+667.613736917" lastFinishedPulling="2025-12-03 19:05:52.223042246 +0000 UTC m=+672.821636710" observedRunningTime="2025-12-03 19:05:53.275509913 +0000 UTC m=+673.874104417" watchObservedRunningTime="2025-12-03 19:05:56.814354879 +0000 UTC m=+677.412949353" Dec 03 19:05:57 crc kubenswrapper[4731]: I1203 19:05:57.105915 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:57 crc kubenswrapper[4731]: I1203 
19:05:57.105966 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:57 crc kubenswrapper[4731]: I1203 19:05:57.110462 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:57 crc kubenswrapper[4731]: I1203 19:05:57.280496 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5964fc6d77-whgh2" Dec 03 19:05:57 crc kubenswrapper[4731]: I1203 19:05:57.342213 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-65n9c"] Dec 03 19:06:07 crc kubenswrapper[4731]: I1203 19:06:07.319562 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gpff6" Dec 03 19:06:22 crc kubenswrapper[4731]: I1203 19:06:22.389161 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-65n9c" podUID="3033d290-d147-4727-8d61-0dabed08e76d" containerName="console" containerID="cri-o://17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9" gracePeriod=15 Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.307644 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-65n9c_3033d290-d147-4727-8d61-0dabed08e76d/console/0.log" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.308224 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-65n9c" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.411002 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-oauth-config\") pod \"3033d290-d147-4727-8d61-0dabed08e76d\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.411114 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-service-ca\") pod \"3033d290-d147-4727-8d61-0dabed08e76d\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.411155 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-console-config\") pod \"3033d290-d147-4727-8d61-0dabed08e76d\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.411230 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-oauth-serving-cert\") pod \"3033d290-d147-4727-8d61-0dabed08e76d\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.411316 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jnk4\" (UniqueName: \"kubernetes.io/projected/3033d290-d147-4727-8d61-0dabed08e76d-kube-api-access-9jnk4\") pod \"3033d290-d147-4727-8d61-0dabed08e76d\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.411338 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-trusted-ca-bundle\") pod \"3033d290-d147-4727-8d61-0dabed08e76d\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.411364 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-serving-cert\") pod \"3033d290-d147-4727-8d61-0dabed08e76d\" (UID: \"3033d290-d147-4727-8d61-0dabed08e76d\") " Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.412759 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3033d290-d147-4727-8d61-0dabed08e76d" (UID: "3033d290-d147-4727-8d61-0dabed08e76d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.413033 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-service-ca" (OuterVolumeSpecName: "service-ca") pod "3033d290-d147-4727-8d61-0dabed08e76d" (UID: "3033d290-d147-4727-8d61-0dabed08e76d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.413198 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3033d290-d147-4727-8d61-0dabed08e76d" (UID: "3033d290-d147-4727-8d61-0dabed08e76d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.413405 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-console-config" (OuterVolumeSpecName: "console-config") pod "3033d290-d147-4727-8d61-0dabed08e76d" (UID: "3033d290-d147-4727-8d61-0dabed08e76d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.417485 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3033d290-d147-4727-8d61-0dabed08e76d" (UID: "3033d290-d147-4727-8d61-0dabed08e76d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.417676 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3033d290-d147-4727-8d61-0dabed08e76d" (UID: "3033d290-d147-4727-8d61-0dabed08e76d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.418626 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3033d290-d147-4727-8d61-0dabed08e76d-kube-api-access-9jnk4" (OuterVolumeSpecName: "kube-api-access-9jnk4") pod "3033d290-d147-4727-8d61-0dabed08e76d" (UID: "3033d290-d147-4727-8d61-0dabed08e76d"). InnerVolumeSpecName "kube-api-access-9jnk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.447340 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-65n9c_3033d290-d147-4727-8d61-0dabed08e76d/console/0.log" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.447387 4731 generic.go:334] "Generic (PLEG): container finished" podID="3033d290-d147-4727-8d61-0dabed08e76d" containerID="17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9" exitCode=2 Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.447420 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65n9c" event={"ID":"3033d290-d147-4727-8d61-0dabed08e76d","Type":"ContainerDied","Data":"17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9"} Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.447447 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65n9c" event={"ID":"3033d290-d147-4727-8d61-0dabed08e76d","Type":"ContainerDied","Data":"4231724b37da979aa08e9c07bf3a38a430e943c6d1a7f6ee2b3d8a8c4a159550"} Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.447464 4731 scope.go:117] "RemoveContainer" containerID="17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.447569 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-65n9c" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.479825 4731 scope.go:117] "RemoveContainer" containerID="17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9" Dec 03 19:06:23 crc kubenswrapper[4731]: E1203 19:06:23.480425 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9\": container with ID starting with 17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9 not found: ID does not exist" containerID="17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.480474 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9"} err="failed to get container status \"17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9\": rpc error: code = NotFound desc = could not find container \"17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9\": container with ID starting with 17c354d24af62d1bf4bc0db9e8de36130c100612e09f5f524a9ea25400f842a9 not found: ID does not exist" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.482351 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-65n9c"] Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.487058 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-65n9c"] Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.513309 4731 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.513347 4731 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jnk4\" (UniqueName: \"kubernetes.io/projected/3033d290-d147-4727-8d61-0dabed08e76d-kube-api-access-9jnk4\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.513361 4731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.513373 4731 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.513384 4731 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3033d290-d147-4727-8d61-0dabed08e76d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.513395 4731 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.513405 4731 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3033d290-d147-4727-8d61-0dabed08e76d-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:23 crc kubenswrapper[4731]: I1203 19:06:23.883185 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3033d290-d147-4727-8d61-0dabed08e76d" path="/var/lib/kubelet/pods/3033d290-d147-4727-8d61-0dabed08e76d/volumes" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.354038 4731 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5"] Dec 03 19:06:24 crc kubenswrapper[4731]: E1203 19:06:24.354360 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3033d290-d147-4727-8d61-0dabed08e76d" containerName="console" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.354382 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3033d290-d147-4727-8d61-0dabed08e76d" containerName="console" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.354515 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="3033d290-d147-4727-8d61-0dabed08e76d" containerName="console" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.355484 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.358108 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.363929 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5"] Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.527205 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.527581 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.527615 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rll2\" (UniqueName: \"kubernetes.io/projected/e820b8d7-cbff-4738-9618-0b1744a2bd9c-kube-api-access-6rll2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.628324 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.628389 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rll2\" (UniqueName: \"kubernetes.io/projected/e820b8d7-cbff-4738-9618-0b1744a2bd9c-kube-api-access-6rll2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.628422 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-bundle\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.628924 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.629172 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.646526 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rll2\" (UniqueName: \"kubernetes.io/projected/e820b8d7-cbff-4738-9618-0b1744a2bd9c-kube-api-access-6rll2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.703523 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:24 crc kubenswrapper[4731]: I1203 19:06:24.884790 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5"] Dec 03 19:06:25 crc kubenswrapper[4731]: I1203 19:06:25.463529 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" event={"ID":"e820b8d7-cbff-4738-9618-0b1744a2bd9c","Type":"ContainerStarted","Data":"51282bc8124ab642c9cc64eca15837394f0f5911e506c47b0c10a531b3e4f12e"} Dec 03 19:06:26 crc kubenswrapper[4731]: I1203 19:06:26.468945 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:06:26 crc kubenswrapper[4731]: I1203 19:06:26.469033 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:06:26 crc kubenswrapper[4731]: I1203 19:06:26.471759 4731 generic.go:334] "Generic (PLEG): container finished" podID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerID="790117a2a7fdec4817f0d9e93b47817552f3b4214abe699f3e1b7f587d9ea49a" exitCode=0 Dec 03 19:06:26 crc kubenswrapper[4731]: I1203 19:06:26.471799 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" 
event={"ID":"e820b8d7-cbff-4738-9618-0b1744a2bd9c","Type":"ContainerDied","Data":"790117a2a7fdec4817f0d9e93b47817552f3b4214abe699f3e1b7f587d9ea49a"} Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.120749 4731 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.716823 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nsx2m"] Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.719739 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.729527 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nsx2m"] Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.873843 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-utilities\") pod \"redhat-operators-nsx2m\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.873886 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9lsl\" (UniqueName: \"kubernetes.io/projected/16a1f375-17f3-4d66-a6a3-4861594455da-kube-api-access-g9lsl\") pod \"redhat-operators-nsx2m\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.873977 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-catalog-content\") pod \"redhat-operators-nsx2m\" 
(UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.975582 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9lsl\" (UniqueName: \"kubernetes.io/projected/16a1f375-17f3-4d66-a6a3-4861594455da-kube-api-access-g9lsl\") pod \"redhat-operators-nsx2m\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.975712 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-catalog-content\") pod \"redhat-operators-nsx2m\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.975740 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-utilities\") pod \"redhat-operators-nsx2m\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.976608 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-utilities\") pod \"redhat-operators-nsx2m\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:27 crc kubenswrapper[4731]: I1203 19:06:27.976845 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-catalog-content\") pod \"redhat-operators-nsx2m\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " 
pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:28 crc kubenswrapper[4731]: I1203 19:06:28.003645 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9lsl\" (UniqueName: \"kubernetes.io/projected/16a1f375-17f3-4d66-a6a3-4861594455da-kube-api-access-g9lsl\") pod \"redhat-operators-nsx2m\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:28 crc kubenswrapper[4731]: I1203 19:06:28.067442 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:28 crc kubenswrapper[4731]: I1203 19:06:28.359679 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nsx2m"] Dec 03 19:06:28 crc kubenswrapper[4731]: W1203 19:06:28.364873 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a1f375_17f3_4d66_a6a3_4861594455da.slice/crio-1a1316353354b7d9f8ee9a858e237a0c0b355ddb36ec44c75786f2879acbd1b2 WatchSource:0}: Error finding container 1a1316353354b7d9f8ee9a858e237a0c0b355ddb36ec44c75786f2879acbd1b2: Status 404 returned error can't find the container with id 1a1316353354b7d9f8ee9a858e237a0c0b355ddb36ec44c75786f2879acbd1b2 Dec 03 19:06:28 crc kubenswrapper[4731]: I1203 19:06:28.483066 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsx2m" event={"ID":"16a1f375-17f3-4d66-a6a3-4861594455da","Type":"ContainerStarted","Data":"1a1316353354b7d9f8ee9a858e237a0c0b355ddb36ec44c75786f2879acbd1b2"} Dec 03 19:06:28 crc kubenswrapper[4731]: I1203 19:06:28.484633 4731 generic.go:334] "Generic (PLEG): container finished" podID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerID="54b44c856a5da8254c530297bbace819ea12ffd2ac58e9078bafc4d140b9323a" exitCode=0 Dec 03 19:06:28 crc kubenswrapper[4731]: I1203 19:06:28.484667 4731 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" event={"ID":"e820b8d7-cbff-4738-9618-0b1744a2bd9c","Type":"ContainerDied","Data":"54b44c856a5da8254c530297bbace819ea12ffd2ac58e9078bafc4d140b9323a"} Dec 03 19:06:29 crc kubenswrapper[4731]: I1203 19:06:29.493287 4731 generic.go:334] "Generic (PLEG): container finished" podID="16a1f375-17f3-4d66-a6a3-4861594455da" containerID="6d6d63c8e624a678592f5e219d6ed331edb3ddd2de1ceb1e71a0185f54426977" exitCode=0 Dec 03 19:06:29 crc kubenswrapper[4731]: I1203 19:06:29.493367 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsx2m" event={"ID":"16a1f375-17f3-4d66-a6a3-4861594455da","Type":"ContainerDied","Data":"6d6d63c8e624a678592f5e219d6ed331edb3ddd2de1ceb1e71a0185f54426977"} Dec 03 19:06:29 crc kubenswrapper[4731]: I1203 19:06:29.496780 4731 generic.go:334] "Generic (PLEG): container finished" podID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerID="7d4a3cf84aeb6029fdc2596fc6eefe349c755445686eff99a8ed146d6073986e" exitCode=0 Dec 03 19:06:29 crc kubenswrapper[4731]: I1203 19:06:29.496836 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" event={"ID":"e820b8d7-cbff-4738-9618-0b1744a2bd9c","Type":"ContainerDied","Data":"7d4a3cf84aeb6029fdc2596fc6eefe349c755445686eff99a8ed146d6073986e"} Dec 03 19:06:30 crc kubenswrapper[4731]: I1203 19:06:30.824215 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:30 crc kubenswrapper[4731]: I1203 19:06:30.939988 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-bundle\") pod \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " Dec 03 19:06:30 crc kubenswrapper[4731]: I1203 19:06:30.940076 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rll2\" (UniqueName: \"kubernetes.io/projected/e820b8d7-cbff-4738-9618-0b1744a2bd9c-kube-api-access-6rll2\") pod \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " Dec 03 19:06:30 crc kubenswrapper[4731]: I1203 19:06:30.940114 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-util\") pod \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\" (UID: \"e820b8d7-cbff-4738-9618-0b1744a2bd9c\") " Dec 03 19:06:30 crc kubenswrapper[4731]: I1203 19:06:30.941555 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-bundle" (OuterVolumeSpecName: "bundle") pod "e820b8d7-cbff-4738-9618-0b1744a2bd9c" (UID: "e820b8d7-cbff-4738-9618-0b1744a2bd9c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:06:30 crc kubenswrapper[4731]: I1203 19:06:30.947353 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e820b8d7-cbff-4738-9618-0b1744a2bd9c-kube-api-access-6rll2" (OuterVolumeSpecName: "kube-api-access-6rll2") pod "e820b8d7-cbff-4738-9618-0b1744a2bd9c" (UID: "e820b8d7-cbff-4738-9618-0b1744a2bd9c"). InnerVolumeSpecName "kube-api-access-6rll2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:06:30 crc kubenswrapper[4731]: I1203 19:06:30.960336 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-util" (OuterVolumeSpecName: "util") pod "e820b8d7-cbff-4738-9618-0b1744a2bd9c" (UID: "e820b8d7-cbff-4738-9618-0b1744a2bd9c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:06:31 crc kubenswrapper[4731]: I1203 19:06:31.041512 4731 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:31 crc kubenswrapper[4731]: I1203 19:06:31.041561 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rll2\" (UniqueName: \"kubernetes.io/projected/e820b8d7-cbff-4738-9618-0b1744a2bd9c-kube-api-access-6rll2\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:31 crc kubenswrapper[4731]: I1203 19:06:31.041586 4731 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e820b8d7-cbff-4738-9618-0b1744a2bd9c-util\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:31 crc kubenswrapper[4731]: I1203 19:06:31.521288 4731 generic.go:334] "Generic (PLEG): container finished" podID="16a1f375-17f3-4d66-a6a3-4861594455da" containerID="56f96411ce3aafcae0fbe4d1ab311e6d6e01289ee013500cabac2f898c4d4fea" exitCode=0 Dec 03 19:06:31 crc kubenswrapper[4731]: I1203 19:06:31.521433 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsx2m" event={"ID":"16a1f375-17f3-4d66-a6a3-4861594455da","Type":"ContainerDied","Data":"56f96411ce3aafcae0fbe4d1ab311e6d6e01289ee013500cabac2f898c4d4fea"} Dec 03 19:06:31 crc kubenswrapper[4731]: I1203 19:06:31.526185 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" event={"ID":"e820b8d7-cbff-4738-9618-0b1744a2bd9c","Type":"ContainerDied","Data":"51282bc8124ab642c9cc64eca15837394f0f5911e506c47b0c10a531b3e4f12e"} Dec 03 19:06:31 crc kubenswrapper[4731]: I1203 19:06:31.526160 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5" Dec 03 19:06:31 crc kubenswrapper[4731]: I1203 19:06:31.526295 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51282bc8124ab642c9cc64eca15837394f0f5911e506c47b0c10a531b3e4f12e" Dec 03 19:06:32 crc kubenswrapper[4731]: I1203 19:06:32.535841 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsx2m" event={"ID":"16a1f375-17f3-4d66-a6a3-4861594455da","Type":"ContainerStarted","Data":"cf3315129b30169ec4bf2925c3588e7f007f5c4bbef90f5be61fc54dba215bec"} Dec 03 19:06:32 crc kubenswrapper[4731]: I1203 19:06:32.563330 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nsx2m" podStartSLOduration=3.114986801 podStartE2EDuration="5.563311976s" podCreationTimestamp="2025-12-03 19:06:27 +0000 UTC" firstStartedPulling="2025-12-03 19:06:29.496484399 +0000 UTC m=+710.095078873" lastFinishedPulling="2025-12-03 19:06:31.944809544 +0000 UTC m=+712.543404048" observedRunningTime="2025-12-03 19:06:32.55901942 +0000 UTC m=+713.157613884" watchObservedRunningTime="2025-12-03 19:06:32.563311976 +0000 UTC m=+713.161906440" Dec 03 19:06:38 crc kubenswrapper[4731]: I1203 19:06:38.068669 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:38 crc kubenswrapper[4731]: I1203 19:06:38.071192 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:38 crc kubenswrapper[4731]: I1203 19:06:38.115186 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:38 crc kubenswrapper[4731]: I1203 19:06:38.614788 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:39 crc kubenswrapper[4731]: I1203 19:06:39.694672 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nsx2m"] Dec 03 19:06:41 crc kubenswrapper[4731]: I1203 19:06:41.590534 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nsx2m" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" containerName="registry-server" containerID="cri-o://cf3315129b30169ec4bf2925c3588e7f007f5c4bbef90f5be61fc54dba215bec" gracePeriod=2 Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.580737 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv"] Dec 03 19:06:44 crc kubenswrapper[4731]: E1203 19:06:44.581482 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerName="extract" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.581497 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerName="extract" Dec 03 19:06:44 crc kubenswrapper[4731]: E1203 19:06:44.581506 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerName="util" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.581513 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerName="util" Dec 03 19:06:44 crc kubenswrapper[4731]: E1203 19:06:44.581528 4731 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerName="pull" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.581534 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerName="pull" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.601742 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e820b8d7-cbff-4738-9618-0b1744a2bd9c" containerName="extract" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.602759 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.609809 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-97qbz" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.610022 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.610227 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.611040 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.611370 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.621823 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv"] Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.657141 4731 generic.go:334] "Generic (PLEG): container finished" podID="16a1f375-17f3-4d66-a6a3-4861594455da" 
containerID="cf3315129b30169ec4bf2925c3588e7f007f5c4bbef90f5be61fc54dba215bec" exitCode=0 Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.657206 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsx2m" event={"ID":"16a1f375-17f3-4d66-a6a3-4861594455da","Type":"ContainerDied","Data":"cf3315129b30169ec4bf2925c3588e7f007f5c4bbef90f5be61fc54dba215bec"} Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.677030 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a724a74-ca11-42ef-8a1f-96665b3a6773-apiservice-cert\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.677096 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a724a74-ca11-42ef-8a1f-96665b3a6773-webhook-cert\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.677125 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9psl\" (UniqueName: \"kubernetes.io/projected/6a724a74-ca11-42ef-8a1f-96665b3a6773-kube-api-access-c9psl\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.709420 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.778489 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-catalog-content\") pod \"16a1f375-17f3-4d66-a6a3-4861594455da\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.778563 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-utilities\") pod \"16a1f375-17f3-4d66-a6a3-4861594455da\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.778657 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9lsl\" (UniqueName: \"kubernetes.io/projected/16a1f375-17f3-4d66-a6a3-4861594455da-kube-api-access-g9lsl\") pod \"16a1f375-17f3-4d66-a6a3-4861594455da\" (UID: \"16a1f375-17f3-4d66-a6a3-4861594455da\") " Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.778978 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a724a74-ca11-42ef-8a1f-96665b3a6773-apiservice-cert\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.779033 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a724a74-ca11-42ef-8a1f-96665b3a6773-webhook-cert\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " 
pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.779058 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9psl\" (UniqueName: \"kubernetes.io/projected/6a724a74-ca11-42ef-8a1f-96665b3a6773-kube-api-access-c9psl\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.779782 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-utilities" (OuterVolumeSpecName: "utilities") pod "16a1f375-17f3-4d66-a6a3-4861594455da" (UID: "16a1f375-17f3-4d66-a6a3-4861594455da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.785566 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a724a74-ca11-42ef-8a1f-96665b3a6773-apiservice-cert\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.790928 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a724a74-ca11-42ef-8a1f-96665b3a6773-webhook-cert\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.796762 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9psl\" (UniqueName: 
\"kubernetes.io/projected/6a724a74-ca11-42ef-8a1f-96665b3a6773-kube-api-access-c9psl\") pod \"metallb-operator-controller-manager-7686ff65d7-9nhhv\" (UID: \"6a724a74-ca11-42ef-8a1f-96665b3a6773\") " pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.811479 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a1f375-17f3-4d66-a6a3-4861594455da-kube-api-access-g9lsl" (OuterVolumeSpecName: "kube-api-access-g9lsl") pod "16a1f375-17f3-4d66-a6a3-4861594455da" (UID: "16a1f375-17f3-4d66-a6a3-4861594455da"). InnerVolumeSpecName "kube-api-access-g9lsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.836452 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79695f5758-49qdw"] Dec 03 19:06:44 crc kubenswrapper[4731]: E1203 19:06:44.838131 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" containerName="extract-utilities" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.838161 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" containerName="extract-utilities" Dec 03 19:06:44 crc kubenswrapper[4731]: E1203 19:06:44.838179 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" containerName="registry-server" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.838186 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" containerName="registry-server" Dec 03 19:06:44 crc kubenswrapper[4731]: E1203 19:06:44.838200 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" containerName="extract-content" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.838209 4731 
state_mem.go:107] "Deleted CPUSet assignment" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" containerName="extract-content" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.838357 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" containerName="registry-server" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.838961 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.840927 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6f9hp" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.841360 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.841619 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.863801 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79695f5758-49qdw"] Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.881938 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjbwl\" (UniqueName: \"kubernetes.io/projected/2255151e-dced-4ba0-8329-89984ef8583d-kube-api-access-mjbwl\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: \"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.882055 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2255151e-dced-4ba0-8329-89984ef8583d-webhook-cert\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: \"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.882102 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2255151e-dced-4ba0-8329-89984ef8583d-apiservice-cert\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: \"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.882183 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9lsl\" (UniqueName: \"kubernetes.io/projected/16a1f375-17f3-4d66-a6a3-4861594455da-kube-api-access-g9lsl\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.882201 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.920409 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16a1f375-17f3-4d66-a6a3-4861594455da" (UID: "16a1f375-17f3-4d66-a6a3-4861594455da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.931426 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.984649 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2255151e-dced-4ba0-8329-89984ef8583d-apiservice-cert\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: \"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.984749 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjbwl\" (UniqueName: \"kubernetes.io/projected/2255151e-dced-4ba0-8329-89984ef8583d-kube-api-access-mjbwl\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: \"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.984795 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2255151e-dced-4ba0-8329-89984ef8583d-webhook-cert\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: \"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.984834 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a1f375-17f3-4d66-a6a3-4861594455da-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.988758 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2255151e-dced-4ba0-8329-89984ef8583d-apiservice-cert\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: 
\"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:44 crc kubenswrapper[4731]: I1203 19:06:44.989247 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2255151e-dced-4ba0-8329-89984ef8583d-webhook-cert\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: \"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.003115 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjbwl\" (UniqueName: \"kubernetes.io/projected/2255151e-dced-4ba0-8329-89984ef8583d-kube-api-access-mjbwl\") pod \"metallb-operator-webhook-server-79695f5758-49qdw\" (UID: \"2255151e-dced-4ba0-8329-89984ef8583d\") " pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.155808 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.212532 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv"] Dec 03 19:06:45 crc kubenswrapper[4731]: W1203 19:06:45.237573 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a724a74_ca11_42ef_8a1f_96665b3a6773.slice/crio-ed7edcbf25385b87c24573df45d928f84bb79f4265cb6be3bc1f5bd15e9dd0fc WatchSource:0}: Error finding container ed7edcbf25385b87c24573df45d928f84bb79f4265cb6be3bc1f5bd15e9dd0fc: Status 404 returned error can't find the container with id ed7edcbf25385b87c24573df45d928f84bb79f4265cb6be3bc1f5bd15e9dd0fc Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.530063 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79695f5758-49qdw"] Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.668532 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsx2m" event={"ID":"16a1f375-17f3-4d66-a6a3-4861594455da","Type":"ContainerDied","Data":"1a1316353354b7d9f8ee9a858e237a0c0b355ddb36ec44c75786f2879acbd1b2"} Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.668810 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nsx2m" Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.668875 4731 scope.go:117] "RemoveContainer" containerID="cf3315129b30169ec4bf2925c3588e7f007f5c4bbef90f5be61fc54dba215bec" Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.670866 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" event={"ID":"2255151e-dced-4ba0-8329-89984ef8583d","Type":"ContainerStarted","Data":"f22116a99ae908020b54620d756f96ce88b730f7de40c6264f013dffb10c8017"} Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.672197 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" event={"ID":"6a724a74-ca11-42ef-8a1f-96665b3a6773","Type":"ContainerStarted","Data":"ed7edcbf25385b87c24573df45d928f84bb79f4265cb6be3bc1f5bd15e9dd0fc"} Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.701113 4731 scope.go:117] "RemoveContainer" containerID="56f96411ce3aafcae0fbe4d1ab311e6d6e01289ee013500cabac2f898c4d4fea" Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.709705 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nsx2m"] Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.715170 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nsx2m"] Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.732082 4731 scope.go:117] "RemoveContainer" containerID="6d6d63c8e624a678592f5e219d6ed331edb3ddd2de1ceb1e71a0185f54426977" Dec 03 19:06:45 crc kubenswrapper[4731]: I1203 19:06:45.862985 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a1f375-17f3-4d66-a6a3-4861594455da" path="/var/lib/kubelet/pods/16a1f375-17f3-4d66-a6a3-4861594455da/volumes" Dec 03 19:06:48 crc kubenswrapper[4731]: I1203 19:06:48.705370 4731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" event={"ID":"6a724a74-ca11-42ef-8a1f-96665b3a6773","Type":"ContainerStarted","Data":"17d22fd08499e74e5c924f05a8c591089c433f6f1c5dfe5e7f8621102c2bffa9"} Dec 03 19:06:48 crc kubenswrapper[4731]: I1203 19:06:48.706762 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:06:48 crc kubenswrapper[4731]: I1203 19:06:48.735807 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" podStartSLOduration=1.8175887080000002 podStartE2EDuration="4.735785579s" podCreationTimestamp="2025-12-03 19:06:44 +0000 UTC" firstStartedPulling="2025-12-03 19:06:45.245881931 +0000 UTC m=+725.844476395" lastFinishedPulling="2025-12-03 19:06:48.164078802 +0000 UTC m=+728.762673266" observedRunningTime="2025-12-03 19:06:48.722592262 +0000 UTC m=+729.321186746" watchObservedRunningTime="2025-12-03 19:06:48.735785579 +0000 UTC m=+729.334380043" Dec 03 19:06:50 crc kubenswrapper[4731]: I1203 19:06:50.720689 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" event={"ID":"2255151e-dced-4ba0-8329-89984ef8583d","Type":"ContainerStarted","Data":"8340c0591007f582bed470574bbbd6bb0ce72a31d5b6dfc228df2a7a9f6d4cd4"} Dec 03 19:06:50 crc kubenswrapper[4731]: I1203 19:06:50.721070 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:06:50 crc kubenswrapper[4731]: I1203 19:06:50.738429 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" podStartSLOduration=1.928886285 podStartE2EDuration="6.738403062s" podCreationTimestamp="2025-12-03 19:06:44 +0000 UTC" firstStartedPulling="2025-12-03 
19:06:45.5455636 +0000 UTC m=+726.144158064" lastFinishedPulling="2025-12-03 19:06:50.355080377 +0000 UTC m=+730.953674841" observedRunningTime="2025-12-03 19:06:50.735433331 +0000 UTC m=+731.334027795" watchObservedRunningTime="2025-12-03 19:06:50.738403062 +0000 UTC m=+731.336997516" Dec 03 19:06:56 crc kubenswrapper[4731]: I1203 19:06:56.468890 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:06:56 crc kubenswrapper[4731]: I1203 19:06:56.469470 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:07:05 crc kubenswrapper[4731]: I1203 19:07:05.169286 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79695f5758-49qdw" Dec 03 19:07:24 crc kubenswrapper[4731]: I1203 19:07:24.935198 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7686ff65d7-9nhhv" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.768822 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vxn82"] Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.771114 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.773665 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.773870 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nsxxv" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.777933 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r"] Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.778717 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.780490 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.784299 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.785578 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r"] Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.864043 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8zkk6"] Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.865072 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8zkk6" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.867968 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.867968 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-cwbg7" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.867979 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.867981 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.888514 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-rs9fx"] Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.889984 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.892143 4731 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.902874 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-rs9fx"] Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.935926 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-reloader\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.937602 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-startup\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.937723 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics-certs\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.937822 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfr4d\" (UniqueName: \"kubernetes.io/projected/974efcbe-f04c-46e1-b9d6-4cd2a537db71-kube-api-access-zfr4d\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.937865 
4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.937935 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49fsh\" (UniqueName: \"kubernetes.io/projected/e655adda-c328-44ce-a95e-7cfe44ce671c-kube-api-access-49fsh\") pod \"frr-k8s-webhook-server-7fcb986d4-s9r7r\" (UID: \"e655adda-c328-44ce-a95e-7cfe44ce671c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.937984 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-sockets\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.938021 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-conf\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:25 crc kubenswrapper[4731]: I1203 19:07:25.938070 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e655adda-c328-44ce-a95e-7cfe44ce671c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-s9r7r\" (UID: \"e655adda-c328-44ce-a95e-7cfe44ce671c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.039519 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-sockets\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.039893 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49fsh\" (UniqueName: \"kubernetes.io/projected/e655adda-c328-44ce-a95e-7cfe44ce671c-kube-api-access-49fsh\") pod \"frr-k8s-webhook-server-7fcb986d4-s9r7r\" (UID: \"e655adda-c328-44ce-a95e-7cfe44ce671c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.039996 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-conf\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.040099 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e655adda-c328-44ce-a95e-7cfe44ce671c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-s9r7r\" (UID: \"e655adda-c328-44ce-a95e-7cfe44ce671c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.040203 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.040321 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce748c4e-5fb2-4792-bd00-3294f9d85144-metallb-excludel2\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.040427 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x586\" (UniqueName: \"kubernetes.io/projected/a0a92812-f3d1-4c8b-816d-034b6cdc1438-kube-api-access-6x586\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.040537 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ftj5\" (UniqueName: \"kubernetes.io/projected/ce748c4e-5fb2-4792-bd00-3294f9d85144-kube-api-access-5ftj5\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: E1203 19:07:26.040627 4731 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 03 19:07:26 crc kubenswrapper[4731]: E1203 19:07:26.040692 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e655adda-c328-44ce-a95e-7cfe44ce671c-cert podName:e655adda-c328-44ce-a95e-7cfe44ce671c nodeName:}" failed. No retries permitted until 2025-12-03 19:07:26.540673681 +0000 UTC m=+767.139268145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e655adda-c328-44ce-a95e-7cfe44ce671c-cert") pod "frr-k8s-webhook-server-7fcb986d4-s9r7r" (UID: "e655adda-c328-44ce-a95e-7cfe44ce671c") : secret "frr-k8s-webhook-server-cert" not found Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.040021 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-sockets\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.040640 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-metrics-certs\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.040542 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-conf\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.041801 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0a92812-f3d1-4c8b-816d-034b6cdc1438-cert\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.041858 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-reloader\") pod \"frr-k8s-vxn82\" (UID: 
\"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.041899 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-startup\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.041938 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics-certs\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.041976 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfr4d\" (UniqueName: \"kubernetes.io/projected/974efcbe-f04c-46e1-b9d6-4cd2a537db71-kube-api-access-zfr4d\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.041999 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.042018 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0a92812-f3d1-4c8b-816d-034b6cdc1438-metrics-certs\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.042238 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-reloader\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.043157 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/974efcbe-f04c-46e1-b9d6-4cd2a537db71-frr-startup\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: E1203 19:07:26.043645 4731 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 03 19:07:26 crc kubenswrapper[4731]: E1203 19:07:26.043761 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics-certs podName:974efcbe-f04c-46e1-b9d6-4cd2a537db71 nodeName:}" failed. No retries permitted until 2025-12-03 19:07:26.543734165 +0000 UTC m=+767.142328619 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics-certs") pod "frr-k8s-vxn82" (UID: "974efcbe-f04c-46e1-b9d6-4cd2a537db71") : secret "frr-k8s-certs-secret" not found Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.044141 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.064066 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfr4d\" (UniqueName: \"kubernetes.io/projected/974efcbe-f04c-46e1-b9d6-4cd2a537db71-kube-api-access-zfr4d\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.064307 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49fsh\" (UniqueName: \"kubernetes.io/projected/e655adda-c328-44ce-a95e-7cfe44ce671c-kube-api-access-49fsh\") pod \"frr-k8s-webhook-server-7fcb986d4-s9r7r\" (UID: \"e655adda-c328-44ce-a95e-7cfe44ce671c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.143742 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x586\" (UniqueName: \"kubernetes.io/projected/a0a92812-f3d1-4c8b-816d-034b6cdc1438-kube-api-access-6x586\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.144190 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ftj5\" (UniqueName: 
\"kubernetes.io/projected/ce748c4e-5fb2-4792-bd00-3294f9d85144-kube-api-access-5ftj5\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.144347 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-metrics-certs\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.144470 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0a92812-f3d1-4c8b-816d-034b6cdc1438-cert\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.144633 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0a92812-f3d1-4c8b-816d-034b6cdc1438-metrics-certs\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.144766 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.144853 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce748c4e-5fb2-4792-bd00-3294f9d85144-metallb-excludel2\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " 
pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: E1203 19:07:26.144999 4731 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 19:07:26 crc kubenswrapper[4731]: E1203 19:07:26.145104 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist podName:ce748c4e-5fb2-4792-bd00-3294f9d85144 nodeName:}" failed. No retries permitted until 2025-12-03 19:07:26.645079816 +0000 UTC m=+767.243674280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist") pod "speaker-8zkk6" (UID: "ce748c4e-5fb2-4792-bd00-3294f9d85144") : secret "metallb-memberlist" not found Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.145683 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce748c4e-5fb2-4792-bd00-3294f9d85144-metallb-excludel2\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.148632 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-metrics-certs\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.148633 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0a92812-f3d1-4c8b-816d-034b6cdc1438-metrics-certs\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.149516 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0a92812-f3d1-4c8b-816d-034b6cdc1438-cert\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.162769 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ftj5\" (UniqueName: \"kubernetes.io/projected/ce748c4e-5fb2-4792-bd00-3294f9d85144-kube-api-access-5ftj5\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.172137 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x586\" (UniqueName: \"kubernetes.io/projected/a0a92812-f3d1-4c8b-816d-034b6cdc1438-kube-api-access-6x586\") pod \"controller-f8648f98b-rs9fx\" (UID: \"a0a92812-f3d1-4c8b-816d-034b6cdc1438\") " pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.203039 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.468819 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.469149 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.469196 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.469838 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"800db96b32fec13e6990bc15d820baddf81db70da12482cb022dfa84b57e785b"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.469895 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://800db96b32fec13e6990bc15d820baddf81db70da12482cb022dfa84b57e785b" gracePeriod=600 Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.479118 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/controller-f8648f98b-rs9fx"] Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.551580 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics-certs\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.551708 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e655adda-c328-44ce-a95e-7cfe44ce671c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-s9r7r\" (UID: \"e655adda-c328-44ce-a95e-7cfe44ce671c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.557395 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974efcbe-f04c-46e1-b9d6-4cd2a537db71-metrics-certs\") pod \"frr-k8s-vxn82\" (UID: \"974efcbe-f04c-46e1-b9d6-4cd2a537db71\") " pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.557449 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e655adda-c328-44ce-a95e-7cfe44ce671c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-s9r7r\" (UID: \"e655adda-c328-44ce-a95e-7cfe44ce671c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.658076 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:26 crc kubenswrapper[4731]: E1203 19:07:26.658306 4731 secret.go:188] Couldn't get secret 
metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 19:07:26 crc kubenswrapper[4731]: E1203 19:07:26.658400 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist podName:ce748c4e-5fb2-4792-bd00-3294f9d85144 nodeName:}" failed. No retries permitted until 2025-12-03 19:07:27.658376344 +0000 UTC m=+768.256970808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist") pod "speaker-8zkk6" (UID: "ce748c4e-5fb2-4792-bd00-3294f9d85144") : secret "metallb-memberlist" not found Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.689650 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.698228 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.934735 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r"] Dec 03 19:07:26 crc kubenswrapper[4731]: W1203 19:07:26.942869 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode655adda_c328_44ce_a95e_7cfe44ce671c.slice/crio-d69f0be228b5deea1aa94640b20a9f19c70a76abf06e8e99d507f3680193a20a WatchSource:0}: Error finding container d69f0be228b5deea1aa94640b20a9f19c70a76abf06e8e99d507f3680193a20a: Status 404 returned error can't find the container with id d69f0be228b5deea1aa94640b20a9f19c70a76abf06e8e99d507f3680193a20a Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.978622 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" 
event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerStarted","Data":"3c2168fea032c0f6222fb5dbf84e4e9b2f7de734a4df73321e0bffc5e1838dcf"} Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.979749 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" event={"ID":"e655adda-c328-44ce-a95e-7cfe44ce671c","Type":"ContainerStarted","Data":"d69f0be228b5deea1aa94640b20a9f19c70a76abf06e8e99d507f3680193a20a"} Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.982489 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="800db96b32fec13e6990bc15d820baddf81db70da12482cb022dfa84b57e785b" exitCode=0 Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.982549 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"800db96b32fec13e6990bc15d820baddf81db70da12482cb022dfa84b57e785b"} Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.982583 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"af854c9d3b32f450920436e43467df2efcb54fae683ee960492021d62f140295"} Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.982602 4731 scope.go:117] "RemoveContainer" containerID="b45c9a750fcf751b97bdf7b9a62b87ccd567866e7ff290c0aea383c7e095916a" Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.985078 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-rs9fx" event={"ID":"a0a92812-f3d1-4c8b-816d-034b6cdc1438","Type":"ContainerStarted","Data":"e250b0648d042816a096cdc7f33fb05e0e4455ae71d36cb4f011ac178e68507b"} Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.985126 4731 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-rs9fx" event={"ID":"a0a92812-f3d1-4c8b-816d-034b6cdc1438","Type":"ContainerStarted","Data":"37d5c52df72efad3a30e155c7f06c60d51deaf003c78a7164c6aaccbbea717a0"} Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.985140 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-rs9fx" event={"ID":"a0a92812-f3d1-4c8b-816d-034b6cdc1438","Type":"ContainerStarted","Data":"7fc0b06d73d399f6549fe8f78f2bf50465b95728bd66e63271208d0fbae059d7"} Dec 03 19:07:26 crc kubenswrapper[4731]: I1203 19:07:26.985278 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:27 crc kubenswrapper[4731]: I1203 19:07:27.021688 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-rs9fx" podStartSLOduration=2.021660342 podStartE2EDuration="2.021660342s" podCreationTimestamp="2025-12-03 19:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:07:27.01930964 +0000 UTC m=+767.617904114" watchObservedRunningTime="2025-12-03 19:07:27.021660342 +0000 UTC m=+767.620254826" Dec 03 19:07:27 crc kubenswrapper[4731]: I1203 19:07:27.672177 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" Dec 03 19:07:27 crc kubenswrapper[4731]: I1203 19:07:27.679111 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce748c4e-5fb2-4792-bd00-3294f9d85144-memberlist\") pod \"speaker-8zkk6\" (UID: \"ce748c4e-5fb2-4792-bd00-3294f9d85144\") " pod="metallb-system/speaker-8zkk6" 
Dec 03 19:07:27 crc kubenswrapper[4731]: I1203 19:07:27.978740 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8zkk6" Dec 03 19:07:28 crc kubenswrapper[4731]: W1203 19:07:28.021499 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce748c4e_5fb2_4792_bd00_3294f9d85144.slice/crio-476cb46a175ea8550e031f5622816dd1c70cc192a89184baca042a2f3f81bf7f WatchSource:0}: Error finding container 476cb46a175ea8550e031f5622816dd1c70cc192a89184baca042a2f3f81bf7f: Status 404 returned error can't find the container with id 476cb46a175ea8550e031f5622816dd1c70cc192a89184baca042a2f3f81bf7f Dec 03 19:07:29 crc kubenswrapper[4731]: I1203 19:07:29.004164 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8zkk6" event={"ID":"ce748c4e-5fb2-4792-bd00-3294f9d85144","Type":"ContainerStarted","Data":"c0ce634ae09cffc068fff9cfc64de81ecbbfac372aab382b434bb82b5e902a08"} Dec 03 19:07:29 crc kubenswrapper[4731]: I1203 19:07:29.004570 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8zkk6" event={"ID":"ce748c4e-5fb2-4792-bd00-3294f9d85144","Type":"ContainerStarted","Data":"aeb5f4ad40e57a277f80aeeb7e325ca566ab4f1491a2957cb5d8d0284d38bc11"} Dec 03 19:07:29 crc kubenswrapper[4731]: I1203 19:07:29.004588 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8zkk6" event={"ID":"ce748c4e-5fb2-4792-bd00-3294f9d85144","Type":"ContainerStarted","Data":"476cb46a175ea8550e031f5622816dd1c70cc192a89184baca042a2f3f81bf7f"} Dec 03 19:07:29 crc kubenswrapper[4731]: I1203 19:07:29.004840 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8zkk6" Dec 03 19:07:29 crc kubenswrapper[4731]: I1203 19:07:29.033967 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8zkk6" podStartSLOduration=4.033943744 
podStartE2EDuration="4.033943744s" podCreationTimestamp="2025-12-03 19:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:07:29.031436797 +0000 UTC m=+769.630031261" watchObservedRunningTime="2025-12-03 19:07:29.033943744 +0000 UTC m=+769.632538228" Dec 03 19:07:35 crc kubenswrapper[4731]: I1203 19:07:35.053512 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" event={"ID":"e655adda-c328-44ce-a95e-7cfe44ce671c","Type":"ContainerStarted","Data":"e616af632ea608ac8a7cab900a84b94ac9ff72f8d1bf22be996e8f3b8489cd27"} Dec 03 19:07:35 crc kubenswrapper[4731]: I1203 19:07:35.054112 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:35 crc kubenswrapper[4731]: I1203 19:07:35.056435 4731 generic.go:334] "Generic (PLEG): container finished" podID="974efcbe-f04c-46e1-b9d6-4cd2a537db71" containerID="10afc6c758536afdc267a1c3150c12a2b832b170005a0d0e909fc5d122de95b4" exitCode=0 Dec 03 19:07:35 crc kubenswrapper[4731]: I1203 19:07:35.056506 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerDied","Data":"10afc6c758536afdc267a1c3150c12a2b832b170005a0d0e909fc5d122de95b4"} Dec 03 19:07:35 crc kubenswrapper[4731]: I1203 19:07:35.081868 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" podStartSLOduration=2.423480706 podStartE2EDuration="10.081422437s" podCreationTimestamp="2025-12-03 19:07:25 +0000 UTC" firstStartedPulling="2025-12-03 19:07:26.945876878 +0000 UTC m=+767.544471332" lastFinishedPulling="2025-12-03 19:07:34.603818589 +0000 UTC m=+775.202413063" observedRunningTime="2025-12-03 19:07:35.079634042 +0000 UTC m=+775.678228496" 
watchObservedRunningTime="2025-12-03 19:07:35.081422437 +0000 UTC m=+775.680016901" Dec 03 19:07:36 crc kubenswrapper[4731]: I1203 19:07:36.064883 4731 generic.go:334] "Generic (PLEG): container finished" podID="974efcbe-f04c-46e1-b9d6-4cd2a537db71" containerID="0d9b27d13308fdddfc200bed4ed77b5b60642be105a1edaa3d15385180f18668" exitCode=0 Dec 03 19:07:36 crc kubenswrapper[4731]: I1203 19:07:36.065022 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerDied","Data":"0d9b27d13308fdddfc200bed4ed77b5b60642be105a1edaa3d15385180f18668"} Dec 03 19:07:36 crc kubenswrapper[4731]: I1203 19:07:36.206826 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-rs9fx" Dec 03 19:07:37 crc kubenswrapper[4731]: I1203 19:07:37.073666 4731 generic.go:334] "Generic (PLEG): container finished" podID="974efcbe-f04c-46e1-b9d6-4cd2a537db71" containerID="e0eeecfed036ff0943087d4129ef382ebcff5731fcb99c5d2c6ff0cc3e6afded" exitCode=0 Dec 03 19:07:37 crc kubenswrapper[4731]: I1203 19:07:37.073718 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerDied","Data":"e0eeecfed036ff0943087d4129ef382ebcff5731fcb99c5d2c6ff0cc3e6afded"} Dec 03 19:07:38 crc kubenswrapper[4731]: I1203 19:07:38.083819 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerStarted","Data":"35c7dd8db5704cca5823f8fae4c48c0e5b2f91cb45aa3cddf0bb6d101e48b5ad"} Dec 03 19:07:38 crc kubenswrapper[4731]: I1203 19:07:38.083882 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerStarted","Data":"1cb30c56b60db12e06668cb1ea3309939b87500d2dc7bf6817bff2bb3639ef73"} Dec 
03 19:07:38 crc kubenswrapper[4731]: I1203 19:07:38.083898 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerStarted","Data":"f223de5c024bce234a1c2eed05952e622221e78917a582c72e573edb323a4103"} Dec 03 19:07:38 crc kubenswrapper[4731]: I1203 19:07:38.083912 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerStarted","Data":"d32a0709afa93e9a2a015ad04b124aa09b1938c2b5020e24a1b463a23b421cd1"} Dec 03 19:07:38 crc kubenswrapper[4731]: I1203 19:07:38.083925 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerStarted","Data":"7066cf271684c0dcb56a24652d619884fff796b44e0113af3b995e47d9870172"} Dec 03 19:07:39 crc kubenswrapper[4731]: I1203 19:07:39.096312 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vxn82" event={"ID":"974efcbe-f04c-46e1-b9d6-4cd2a537db71","Type":"ContainerStarted","Data":"e8d301f0ef51ccd6bf209fccb1877d59cb491d01fb8ab9b56783938fcc09c4b0"} Dec 03 19:07:39 crc kubenswrapper[4731]: I1203 19:07:39.096837 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:39 crc kubenswrapper[4731]: I1203 19:07:39.132947 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vxn82" podStartSLOduration=6.319345176 podStartE2EDuration="14.132919s" podCreationTimestamp="2025-12-03 19:07:25 +0000 UTC" firstStartedPulling="2025-12-03 19:07:26.838515892 +0000 UTC m=+767.437110356" lastFinishedPulling="2025-12-03 19:07:34.652089706 +0000 UTC m=+775.250684180" observedRunningTime="2025-12-03 19:07:39.126876204 +0000 UTC m=+779.725470738" watchObservedRunningTime="2025-12-03 19:07:39.132919 +0000 UTC m=+779.731513494" Dec 03 
19:07:41 crc kubenswrapper[4731]: I1203 19:07:41.690292 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:41 crc kubenswrapper[4731]: I1203 19:07:41.728046 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:46 crc kubenswrapper[4731]: I1203 19:07:46.703401 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-s9r7r" Dec 03 19:07:47 crc kubenswrapper[4731]: I1203 19:07:47.983894 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8zkk6" Dec 03 19:07:50 crc kubenswrapper[4731]: I1203 19:07:50.998410 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-s7qzs"] Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.015653 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s7qzs" Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.023723 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.023850 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.023760 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-qn8pk" Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.051898 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s7qzs"] Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.052373 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnjvr\" 
(UniqueName: \"kubernetes.io/projected/a1b901c9-33b3-4055-b542-5680f0e6e5fb-kube-api-access-rnjvr\") pod \"openstack-operator-index-s7qzs\" (UID: \"a1b901c9-33b3-4055-b542-5680f0e6e5fb\") " pod="openstack-operators/openstack-operator-index-s7qzs" Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.153252 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnjvr\" (UniqueName: \"kubernetes.io/projected/a1b901c9-33b3-4055-b542-5680f0e6e5fb-kube-api-access-rnjvr\") pod \"openstack-operator-index-s7qzs\" (UID: \"a1b901c9-33b3-4055-b542-5680f0e6e5fb\") " pod="openstack-operators/openstack-operator-index-s7qzs" Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.172038 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnjvr\" (UniqueName: \"kubernetes.io/projected/a1b901c9-33b3-4055-b542-5680f0e6e5fb-kube-api-access-rnjvr\") pod \"openstack-operator-index-s7qzs\" (UID: \"a1b901c9-33b3-4055-b542-5680f0e6e5fb\") " pod="openstack-operators/openstack-operator-index-s7qzs" Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.368462 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-s7qzs" Dec 03 19:07:51 crc kubenswrapper[4731]: I1203 19:07:51.814241 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s7qzs"] Dec 03 19:07:51 crc kubenswrapper[4731]: W1203 19:07:51.821679 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b901c9_33b3_4055_b542_5680f0e6e5fb.slice/crio-bad805c7b48b7477d9fc92fdd3140501de3d42aec7ed9f0a40bc6ac439d675ed WatchSource:0}: Error finding container bad805c7b48b7477d9fc92fdd3140501de3d42aec7ed9f0a40bc6ac439d675ed: Status 404 returned error can't find the container with id bad805c7b48b7477d9fc92fdd3140501de3d42aec7ed9f0a40bc6ac439d675ed Dec 03 19:07:52 crc kubenswrapper[4731]: I1203 19:07:52.191131 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s7qzs" event={"ID":"a1b901c9-33b3-4055-b542-5680f0e6e5fb","Type":"ContainerStarted","Data":"bad805c7b48b7477d9fc92fdd3140501de3d42aec7ed9f0a40bc6ac439d675ed"} Dec 03 19:07:54 crc kubenswrapper[4731]: I1203 19:07:54.376865 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s7qzs"] Dec 03 19:07:54 crc kubenswrapper[4731]: I1203 19:07:54.982012 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-x5lwc"] Dec 03 19:07:54 crc kubenswrapper[4731]: I1203 19:07:54.982793 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:07:55 crc kubenswrapper[4731]: I1203 19:07:55.002908 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x5lwc"] Dec 03 19:07:55 crc kubenswrapper[4731]: I1203 19:07:55.008162 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55ph7\" (UniqueName: \"kubernetes.io/projected/b7d47a7b-9fb4-4431-99bc-fac95d81a4cf-kube-api-access-55ph7\") pod \"openstack-operator-index-x5lwc\" (UID: \"b7d47a7b-9fb4-4431-99bc-fac95d81a4cf\") " pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:07:55 crc kubenswrapper[4731]: I1203 19:07:55.109956 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55ph7\" (UniqueName: \"kubernetes.io/projected/b7d47a7b-9fb4-4431-99bc-fac95d81a4cf-kube-api-access-55ph7\") pod \"openstack-operator-index-x5lwc\" (UID: \"b7d47a7b-9fb4-4431-99bc-fac95d81a4cf\") " pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:07:55 crc kubenswrapper[4731]: I1203 19:07:55.127641 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55ph7\" (UniqueName: \"kubernetes.io/projected/b7d47a7b-9fb4-4431-99bc-fac95d81a4cf-kube-api-access-55ph7\") pod \"openstack-operator-index-x5lwc\" (UID: \"b7d47a7b-9fb4-4431-99bc-fac95d81a4cf\") " pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:07:55 crc kubenswrapper[4731]: I1203 19:07:55.351613 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:07:55 crc kubenswrapper[4731]: I1203 19:07:55.753888 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x5lwc"] Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.219929 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s7qzs" event={"ID":"a1b901c9-33b3-4055-b542-5680f0e6e5fb","Type":"ContainerStarted","Data":"b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc"} Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.220088 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-s7qzs" podUID="a1b901c9-33b3-4055-b542-5680f0e6e5fb" containerName="registry-server" containerID="cri-o://b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc" gracePeriod=2 Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.222657 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x5lwc" event={"ID":"b7d47a7b-9fb4-4431-99bc-fac95d81a4cf","Type":"ContainerStarted","Data":"ccae3993b657030ba9b590879c1371c0a06ef488acc82e300e2ea20bdc35f968"} Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.222714 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x5lwc" event={"ID":"b7d47a7b-9fb4-4431-99bc-fac95d81a4cf","Type":"ContainerStarted","Data":"814c564497fd1ae94c2b56c3ebd8ef1175bc129f029dcc3f4cc9863bab5ccbeb"} Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.235847 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-s7qzs" podStartSLOduration=2.750172048 podStartE2EDuration="6.235826165s" podCreationTimestamp="2025-12-03 19:07:50 +0000 UTC" firstStartedPulling="2025-12-03 19:07:51.823820838 +0000 UTC 
m=+792.422415302" lastFinishedPulling="2025-12-03 19:07:55.309474955 +0000 UTC m=+795.908069419" observedRunningTime="2025-12-03 19:07:56.23536419 +0000 UTC m=+796.833958664" watchObservedRunningTime="2025-12-03 19:07:56.235826165 +0000 UTC m=+796.834420649" Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.254531 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-x5lwc" podStartSLOduration=2.1840974109999998 podStartE2EDuration="2.25451048s" podCreationTimestamp="2025-12-03 19:07:54 +0000 UTC" firstStartedPulling="2025-12-03 19:07:55.771086112 +0000 UTC m=+796.369680576" lastFinishedPulling="2025-12-03 19:07:55.841499191 +0000 UTC m=+796.440093645" observedRunningTime="2025-12-03 19:07:56.250017902 +0000 UTC m=+796.848612406" watchObservedRunningTime="2025-12-03 19:07:56.25451048 +0000 UTC m=+796.853104964" Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.542914 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s7qzs" Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.639578 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnjvr\" (UniqueName: \"kubernetes.io/projected/a1b901c9-33b3-4055-b542-5680f0e6e5fb-kube-api-access-rnjvr\") pod \"a1b901c9-33b3-4055-b542-5680f0e6e5fb\" (UID: \"a1b901c9-33b3-4055-b542-5680f0e6e5fb\") " Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.646216 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b901c9-33b3-4055-b542-5680f0e6e5fb-kube-api-access-rnjvr" (OuterVolumeSpecName: "kube-api-access-rnjvr") pod "a1b901c9-33b3-4055-b542-5680f0e6e5fb" (UID: "a1b901c9-33b3-4055-b542-5680f0e6e5fb"). InnerVolumeSpecName "kube-api-access-rnjvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.694785 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vxn82" Dec 03 19:07:56 crc kubenswrapper[4731]: I1203 19:07:56.741530 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnjvr\" (UniqueName: \"kubernetes.io/projected/a1b901c9-33b3-4055-b542-5680f0e6e5fb-kube-api-access-rnjvr\") on node \"crc\" DevicePath \"\"" Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.230806 4731 generic.go:334] "Generic (PLEG): container finished" podID="a1b901c9-33b3-4055-b542-5680f0e6e5fb" containerID="b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc" exitCode=0 Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.230896 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s7qzs" Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.230958 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s7qzs" event={"ID":"a1b901c9-33b3-4055-b542-5680f0e6e5fb","Type":"ContainerDied","Data":"b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc"} Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.231008 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s7qzs" event={"ID":"a1b901c9-33b3-4055-b542-5680f0e6e5fb","Type":"ContainerDied","Data":"bad805c7b48b7477d9fc92fdd3140501de3d42aec7ed9f0a40bc6ac439d675ed"} Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.231040 4731 scope.go:117] "RemoveContainer" containerID="b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc" Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.248916 4731 scope.go:117] "RemoveContainer" containerID="b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc" Dec 03 19:07:57 crc 
kubenswrapper[4731]: E1203 19:07:57.249868 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc\": container with ID starting with b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc not found: ID does not exist" containerID="b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc" Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.249993 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc"} err="failed to get container status \"b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc\": rpc error: code = NotFound desc = could not find container \"b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc\": container with ID starting with b7934f76bc858f653a5adee840d8be7aa95e89e1ab7c4aa6a9cd8777e40335fc not found: ID does not exist" Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.270543 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s7qzs"] Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.280445 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-s7qzs"] Dec 03 19:07:57 crc kubenswrapper[4731]: I1203 19:07:57.869366 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b901c9-33b3-4055-b542-5680f0e6e5fb" path="/var/lib/kubelet/pods/a1b901c9-33b3-4055-b542-5680f0e6e5fb/volumes" Dec 03 19:08:05 crc kubenswrapper[4731]: I1203 19:08:05.352573 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:08:05 crc kubenswrapper[4731]: I1203 19:08:05.353140 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:08:05 crc kubenswrapper[4731]: I1203 19:08:05.377921 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:08:06 crc kubenswrapper[4731]: I1203 19:08:06.347501 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-x5lwc" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.047290 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz"] Dec 03 19:08:12 crc kubenswrapper[4731]: E1203 19:08:12.047936 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b901c9-33b3-4055-b542-5680f0e6e5fb" containerName="registry-server" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.047952 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b901c9-33b3-4055-b542-5680f0e6e5fb" containerName="registry-server" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.048109 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b901c9-33b3-4055-b542-5680f0e6e5fb" containerName="registry-server" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.049012 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.051438 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-g8kz6" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.064149 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz"] Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.182092 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78mdq\" (UniqueName: \"kubernetes.io/projected/22d0de6c-28c2-44d2-8aef-c5680cb681aa-kube-api-access-78mdq\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.182218 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-bundle\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.182438 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-util\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 
19:08:12.283794 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78mdq\" (UniqueName: \"kubernetes.io/projected/22d0de6c-28c2-44d2-8aef-c5680cb681aa-kube-api-access-78mdq\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.283850 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-bundle\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.283878 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-util\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.284365 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-util\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.284561 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-bundle\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.306955 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78mdq\" (UniqueName: \"kubernetes.io/projected/22d0de6c-28c2-44d2-8aef-c5680cb681aa-kube-api-access-78mdq\") pod \"a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.380840 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:12 crc kubenswrapper[4731]: I1203 19:08:12.825712 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz"] Dec 03 19:08:13 crc kubenswrapper[4731]: I1203 19:08:13.346073 4731 generic.go:334] "Generic (PLEG): container finished" podID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerID="1ed2c3d3e9b2e86e8809f8a1c082eec982cfecab6df9f5098363014b92ef9d7c" exitCode=0 Dec 03 19:08:13 crc kubenswrapper[4731]: I1203 19:08:13.346159 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" event={"ID":"22d0de6c-28c2-44d2-8aef-c5680cb681aa","Type":"ContainerDied","Data":"1ed2c3d3e9b2e86e8809f8a1c082eec982cfecab6df9f5098363014b92ef9d7c"} Dec 03 19:08:13 crc kubenswrapper[4731]: I1203 19:08:13.346243 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" event={"ID":"22d0de6c-28c2-44d2-8aef-c5680cb681aa","Type":"ContainerStarted","Data":"557cbdea650fa8a988504a76e5e27aecc08b4598210ae5f4b82a0ce3bb750b52"} Dec 03 19:08:14 crc kubenswrapper[4731]: I1203 19:08:14.364447 4731 generic.go:334] "Generic (PLEG): container finished" podID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerID="21739be71bf209e8dd1bcca59700ce66d25faf8eccd80c540bf2a8b11d6e9f77" exitCode=0 Dec 03 19:08:14 crc kubenswrapper[4731]: I1203 19:08:14.364554 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" event={"ID":"22d0de6c-28c2-44d2-8aef-c5680cb681aa","Type":"ContainerDied","Data":"21739be71bf209e8dd1bcca59700ce66d25faf8eccd80c540bf2a8b11d6e9f77"} Dec 03 19:08:15 crc kubenswrapper[4731]: I1203 19:08:15.393187 4731 generic.go:334] "Generic (PLEG): container finished" podID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerID="ef066fe522567a23805cd3171170c2184c8733d2b68988311270489bc8315b1e" exitCode=0 Dec 03 19:08:15 crc kubenswrapper[4731]: I1203 19:08:15.393238 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" event={"ID":"22d0de6c-28c2-44d2-8aef-c5680cb681aa","Type":"ContainerDied","Data":"ef066fe522567a23805cd3171170c2184c8733d2b68988311270489bc8315b1e"} Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.728625 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.757813 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78mdq\" (UniqueName: \"kubernetes.io/projected/22d0de6c-28c2-44d2-8aef-c5680cb681aa-kube-api-access-78mdq\") pod \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.757889 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-util\") pod \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.758018 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-bundle\") pod \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\" (UID: \"22d0de6c-28c2-44d2-8aef-c5680cb681aa\") " Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.759275 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-bundle" (OuterVolumeSpecName: "bundle") pod "22d0de6c-28c2-44d2-8aef-c5680cb681aa" (UID: "22d0de6c-28c2-44d2-8aef-c5680cb681aa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.767817 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d0de6c-28c2-44d2-8aef-c5680cb681aa-kube-api-access-78mdq" (OuterVolumeSpecName: "kube-api-access-78mdq") pod "22d0de6c-28c2-44d2-8aef-c5680cb681aa" (UID: "22d0de6c-28c2-44d2-8aef-c5680cb681aa"). InnerVolumeSpecName "kube-api-access-78mdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.793652 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-util" (OuterVolumeSpecName: "util") pod "22d0de6c-28c2-44d2-8aef-c5680cb681aa" (UID: "22d0de6c-28c2-44d2-8aef-c5680cb681aa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.859840 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78mdq\" (UniqueName: \"kubernetes.io/projected/22d0de6c-28c2-44d2-8aef-c5680cb681aa-kube-api-access-78mdq\") on node \"crc\" DevicePath \"\"" Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.859884 4731 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-util\") on node \"crc\" DevicePath \"\"" Dec 03 19:08:16 crc kubenswrapper[4731]: I1203 19:08:16.859897 4731 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22d0de6c-28c2-44d2-8aef-c5680cb681aa-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:08:17 crc kubenswrapper[4731]: I1203 19:08:17.412717 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" event={"ID":"22d0de6c-28c2-44d2-8aef-c5680cb681aa","Type":"ContainerDied","Data":"557cbdea650fa8a988504a76e5e27aecc08b4598210ae5f4b82a0ce3bb750b52"} Dec 03 19:08:17 crc kubenswrapper[4731]: I1203 19:08:17.412795 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="557cbdea650fa8a988504a76e5e27aecc08b4598210ae5f4b82a0ce3bb750b52" Dec 03 19:08:17 crc kubenswrapper[4731]: I1203 19:08:17.413439 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.515036 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92"] Dec 03 19:08:18 crc kubenswrapper[4731]: E1203 19:08:18.515592 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerName="extract" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.515606 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerName="extract" Dec 03 19:08:18 crc kubenswrapper[4731]: E1203 19:08:18.515617 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerName="util" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.515623 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerName="util" Dec 03 19:08:18 crc kubenswrapper[4731]: E1203 19:08:18.515638 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerName="pull" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.515644 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerName="pull" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.515761 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d0de6c-28c2-44d2-8aef-c5680cb681aa" containerName="extract" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.516310 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.522314 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-t4l88" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.550058 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92"] Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.687073 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5ch\" (UniqueName: \"kubernetes.io/projected/799e7213-a183-43a9-9d26-ff500765cdeb-kube-api-access-vr5ch\") pod \"openstack-operator-controller-operator-5984d69b9f-x9b92\" (UID: \"799e7213-a183-43a9-9d26-ff500765cdeb\") " pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.788898 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr5ch\" (UniqueName: \"kubernetes.io/projected/799e7213-a183-43a9-9d26-ff500765cdeb-kube-api-access-vr5ch\") pod \"openstack-operator-controller-operator-5984d69b9f-x9b92\" (UID: \"799e7213-a183-43a9-9d26-ff500765cdeb\") " pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.807599 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5ch\" (UniqueName: \"kubernetes.io/projected/799e7213-a183-43a9-9d26-ff500765cdeb-kube-api-access-vr5ch\") pod \"openstack-operator-controller-operator-5984d69b9f-x9b92\" (UID: \"799e7213-a183-43a9-9d26-ff500765cdeb\") " pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" Dec 03 19:08:18 crc kubenswrapper[4731]: I1203 19:08:18.832452 4731 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" Dec 03 19:08:19 crc kubenswrapper[4731]: I1203 19:08:19.271848 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92"] Dec 03 19:08:19 crc kubenswrapper[4731]: I1203 19:08:19.427735 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" event={"ID":"799e7213-a183-43a9-9d26-ff500765cdeb","Type":"ContainerStarted","Data":"2bc6886f749d0ccca58d4f19ba4f5ae89db5a4bfa2a9ea92f703f09c158e8f74"} Dec 03 19:08:25 crc kubenswrapper[4731]: I1203 19:08:25.473432 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" event={"ID":"799e7213-a183-43a9-9d26-ff500765cdeb","Type":"ContainerStarted","Data":"53439f9b046cc27846df3c79985b86732a26e9e4e2b713eb058aeed66e87c6b1"} Dec 03 19:08:25 crc kubenswrapper[4731]: I1203 19:08:25.474063 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" Dec 03 19:08:25 crc kubenswrapper[4731]: I1203 19:08:25.504306 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" podStartSLOduration=2.148349504 podStartE2EDuration="7.504283629s" podCreationTimestamp="2025-12-03 19:08:18 +0000 UTC" firstStartedPulling="2025-12-03 19:08:19.289905867 +0000 UTC m=+819.888500331" lastFinishedPulling="2025-12-03 19:08:24.645839992 +0000 UTC m=+825.244434456" observedRunningTime="2025-12-03 19:08:25.499650727 +0000 UTC m=+826.098245191" watchObservedRunningTime="2025-12-03 19:08:25.504283629 +0000 UTC m=+826.102878093" Dec 03 19:08:38 crc kubenswrapper[4731]: I1203 19:08:38.836743 4731 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5984d69b9f-x9b92" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.262208 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p8dqn"] Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.263770 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.288191 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8dqn"] Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.363271 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-utilities\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.363331 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx9f2\" (UniqueName: \"kubernetes.io/projected/2720d9d3-677f-4aff-bc9c-f669bff86f60-kube-api-access-cx9f2\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.363466 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-catalog-content\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.464896 
4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-catalog-content\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.464977 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-utilities\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.465026 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx9f2\" (UniqueName: \"kubernetes.io/projected/2720d9d3-677f-4aff-bc9c-f669bff86f60-kube-api-access-cx9f2\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.465499 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-catalog-content\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.465685 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-utilities\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.509329 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cx9f2\" (UniqueName: \"kubernetes.io/projected/2720d9d3-677f-4aff-bc9c-f669bff86f60-kube-api-access-cx9f2\") pod \"community-operators-p8dqn\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:46 crc kubenswrapper[4731]: I1203 19:08:46.579722 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:47 crc kubenswrapper[4731]: I1203 19:08:47.183573 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8dqn"] Dec 03 19:08:47 crc kubenswrapper[4731]: I1203 19:08:47.631247 4731 generic.go:334] "Generic (PLEG): container finished" podID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerID="1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01" exitCode=0 Dec 03 19:08:47 crc kubenswrapper[4731]: I1203 19:08:47.631512 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8dqn" event={"ID":"2720d9d3-677f-4aff-bc9c-f669bff86f60","Type":"ContainerDied","Data":"1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01"} Dec 03 19:08:47 crc kubenswrapper[4731]: I1203 19:08:47.631592 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8dqn" event={"ID":"2720d9d3-677f-4aff-bc9c-f669bff86f60","Type":"ContainerStarted","Data":"e15a5a07d0df18952570ca8ee3dc5114e2aed1389190677a658dcdfa22eb6ac6"} Dec 03 19:08:49 crc kubenswrapper[4731]: I1203 19:08:49.644523 4731 generic.go:334] "Generic (PLEG): container finished" podID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerID="b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc" exitCode=0 Dec 03 19:08:49 crc kubenswrapper[4731]: I1203 19:08:49.644718 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8dqn" 
event={"ID":"2720d9d3-677f-4aff-bc9c-f669bff86f60","Type":"ContainerDied","Data":"b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc"} Dec 03 19:08:50 crc kubenswrapper[4731]: I1203 19:08:50.658729 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8dqn" event={"ID":"2720d9d3-677f-4aff-bc9c-f669bff86f60","Type":"ContainerStarted","Data":"ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993"} Dec 03 19:08:50 crc kubenswrapper[4731]: I1203 19:08:50.681221 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p8dqn" podStartSLOduration=2.242871056 podStartE2EDuration="4.681193651s" podCreationTimestamp="2025-12-03 19:08:46 +0000 UTC" firstStartedPulling="2025-12-03 19:08:47.632821655 +0000 UTC m=+848.231416119" lastFinishedPulling="2025-12-03 19:08:50.07114425 +0000 UTC m=+850.669738714" observedRunningTime="2025-12-03 19:08:50.676919687 +0000 UTC m=+851.275514161" watchObservedRunningTime="2025-12-03 19:08:50.681193651 +0000 UTC m=+851.279788115" Dec 03 19:08:52 crc kubenswrapper[4731]: I1203 19:08:52.875704 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d7hw7"] Dec 03 19:08:52 crc kubenswrapper[4731]: I1203 19:08:52.882058 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:08:52 crc kubenswrapper[4731]: I1203 19:08:52.890335 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7hw7"] Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.077100 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6vbw\" (UniqueName: \"kubernetes.io/projected/564cdd3d-bea0-4d91-b1d3-b483b550209d-kube-api-access-b6vbw\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.077191 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-catalog-content\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.077236 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-utilities\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.203400 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6vbw\" (UniqueName: \"kubernetes.io/projected/564cdd3d-bea0-4d91-b1d3-b483b550209d-kube-api-access-b6vbw\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.203681 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-catalog-content\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7"
Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.204269 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-catalog-content\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7"
Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.204355 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-utilities\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7"
Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.204677 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-utilities\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7"
Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.236927 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6vbw\" (UniqueName: \"kubernetes.io/projected/564cdd3d-bea0-4d91-b1d3-b483b550209d-kube-api-access-b6vbw\") pod \"certified-operators-d7hw7\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " pod="openshift-marketplace/certified-operators-d7hw7"
Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.499941 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7hw7"
Dec 03 19:08:53 crc kubenswrapper[4731]: I1203 19:08:53.813192 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7hw7"]
Dec 03 19:08:54 crc kubenswrapper[4731]: I1203 19:08:54.696819 4731 generic.go:334] "Generic (PLEG): container finished" podID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerID="4d662aaeb1e035c6fe2a26ce4e791bc0979592c3f03ba13af4653e445f8412b7" exitCode=0
Dec 03 19:08:54 crc kubenswrapper[4731]: I1203 19:08:54.696872 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7hw7" event={"ID":"564cdd3d-bea0-4d91-b1d3-b483b550209d","Type":"ContainerDied","Data":"4d662aaeb1e035c6fe2a26ce4e791bc0979592c3f03ba13af4653e445f8412b7"}
Dec 03 19:08:54 crc kubenswrapper[4731]: I1203 19:08:54.696905 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7hw7" event={"ID":"564cdd3d-bea0-4d91-b1d3-b483b550209d","Type":"ContainerStarted","Data":"0a45e8eba0918952c38deb2af638840f5ef4be454087112e27415c1d56b694d8"}
Dec 03 19:08:55 crc kubenswrapper[4731]: I1203 19:08:55.704794 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7hw7" event={"ID":"564cdd3d-bea0-4d91-b1d3-b483b550209d","Type":"ContainerStarted","Data":"6c98289d81fff0acb9ada96057ffb5bcf5a3df153fad6104b212522f19ea6c91"}
Dec 03 19:08:56 crc kubenswrapper[4731]: I1203 19:08:56.580298 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p8dqn"
Dec 03 19:08:56 crc kubenswrapper[4731]: I1203 19:08:56.580742 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p8dqn"
Dec 03 19:08:56 crc kubenswrapper[4731]: I1203 19:08:56.666970 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p8dqn"
Dec 03 19:08:56 crc kubenswrapper[4731]: I1203 19:08:56.712009 4731 generic.go:334] "Generic (PLEG): container finished" podID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerID="6c98289d81fff0acb9ada96057ffb5bcf5a3df153fad6104b212522f19ea6c91" exitCode=0
Dec 03 19:08:56 crc kubenswrapper[4731]: I1203 19:08:56.712093 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7hw7" event={"ID":"564cdd3d-bea0-4d91-b1d3-b483b550209d","Type":"ContainerDied","Data":"6c98289d81fff0acb9ada96057ffb5bcf5a3df153fad6104b212522f19ea6c91"}
Dec 03 19:08:56 crc kubenswrapper[4731]: I1203 19:08:56.765814 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p8dqn"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.712687 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.715401 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.726089 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9kfqq"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.733865 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.735873 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.757608 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-59qvq"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.758203 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7hw7" event={"ID":"564cdd3d-bea0-4d91-b1d3-b483b550209d","Type":"ContainerStarted","Data":"f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90"}
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.759455 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.778928 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.780079 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.783688 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-88w29"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.797586 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9jj\" (UniqueName: \"kubernetes.io/projected/362b0743-3165-454e-93b9-b6713d26680b-kube-api-access-vp9jj\") pod \"cinder-operator-controller-manager-859b6ccc6-2zlnb\" (UID: \"362b0743-3165-454e-93b9-b6713d26680b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.797660 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7xgk\" (UniqueName: \"kubernetes.io/projected/f6d78107-9c18-4cca-afa8-a360f45c6bac-kube-api-access-v7xgk\") pod \"barbican-operator-controller-manager-7d9dfd778-rlbmk\" (UID: \"f6d78107-9c18-4cca-afa8-a360f45c6bac\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.814059 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.815545 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.817936 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-62vzw"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.819444 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.824210 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.839081 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.854610 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.856029 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.866468 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-k8khw"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.871760 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d7hw7" podStartSLOduration=3.4422990110000002 podStartE2EDuration="5.871735795s" podCreationTimestamp="2025-12-03 19:08:52 +0000 UTC" firstStartedPulling="2025-12-03 19:08:54.698798111 +0000 UTC m=+855.297392575" lastFinishedPulling="2025-12-03 19:08:57.128234895 +0000 UTC m=+857.726829359" observedRunningTime="2025-12-03 19:08:57.824098679 +0000 UTC m=+858.422693143" watchObservedRunningTime="2025-12-03 19:08:57.871735795 +0000 UTC m=+858.470330259"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.895410 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.895449 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.896296 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.897008 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.897028 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.897109 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.897151 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.899509 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9jj\" (UniqueName: \"kubernetes.io/projected/362b0743-3165-454e-93b9-b6713d26680b-kube-api-access-vp9jj\") pod \"cinder-operator-controller-manager-859b6ccc6-2zlnb\" (UID: \"362b0743-3165-454e-93b9-b6713d26680b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.899549 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r42c6\" (UniqueName: \"kubernetes.io/projected/7f50e5ed-f1dc-4706-8a7c-0d7e85a06351-kube-api-access-r42c6\") pod \"glance-operator-controller-manager-77987cd8cd-bqv9t\" (UID: \"7f50e5ed-f1dc-4706-8a7c-0d7e85a06351\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.899616 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7xgk\" (UniqueName: \"kubernetes.io/projected/f6d78107-9c18-4cca-afa8-a360f45c6bac-kube-api-access-v7xgk\") pod \"barbican-operator-controller-manager-7d9dfd778-rlbmk\" (UID: \"f6d78107-9c18-4cca-afa8-a360f45c6bac\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.899663 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966kw\" (UniqueName: \"kubernetes.io/projected/d8e13eee-1041-4fb9-b8d1-6169c42d5de3-kube-api-access-966kw\") pod \"designate-operator-controller-manager-78b4bc895b-2vmhb\" (UID: \"d8e13eee-1041-4fb9-b8d1-6169c42d5de3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.905609 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.907838 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-v68xz"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.908052 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7v5hj"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.913325 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.914710 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.923968 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6b6p9"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.931444 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.947321 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.948671 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.951194 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9jj\" (UniqueName: \"kubernetes.io/projected/362b0743-3165-454e-93b9-b6713d26680b-kube-api-access-vp9jj\") pod \"cinder-operator-controller-manager-859b6ccc6-2zlnb\" (UID: \"362b0743-3165-454e-93b9-b6713d26680b\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.953919 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-clvpt"
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.958409 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58"]
Dec 03 19:08:57 crc kubenswrapper[4731]: I1203 19:08:57.978404 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7xgk\" (UniqueName: \"kubernetes.io/projected/f6d78107-9c18-4cca-afa8-a360f45c6bac-kube-api-access-v7xgk\") pod \"barbican-operator-controller-manager-7d9dfd778-rlbmk\" (UID: \"f6d78107-9c18-4cca-afa8-a360f45c6bac\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.000906 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2ljh\" (UniqueName: \"kubernetes.io/projected/b16dc4e0-2027-45dd-bee8-c5c5346e13f5-kube-api-access-g2ljh\") pod \"ironic-operator-controller-manager-6c548fd776-rpprb\" (UID: \"b16dc4e0-2027-45dd-bee8-c5c5346e13f5\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.000974 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.001029 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52n8h\" (UniqueName: \"kubernetes.io/projected/1c371c8f-7ede-4286-9d7c-6f65f1323237-kube-api-access-52n8h\") pod \"horizon-operator-controller-manager-68c6d99b8f-d22cz\" (UID: \"1c371c8f-7ede-4286-9d7c-6f65f1323237\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.001073 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-966kw\" (UniqueName: \"kubernetes.io/projected/d8e13eee-1041-4fb9-b8d1-6169c42d5de3-kube-api-access-966kw\") pod \"designate-operator-controller-manager-78b4bc895b-2vmhb\" (UID: \"d8e13eee-1041-4fb9-b8d1-6169c42d5de3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.001239 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnj4\" (UniqueName: \"kubernetes.io/projected/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-kube-api-access-sbnj4\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.001371 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r42c6\" (UniqueName: \"kubernetes.io/projected/7f50e5ed-f1dc-4706-8a7c-0d7e85a06351-kube-api-access-r42c6\") pod \"glance-operator-controller-manager-77987cd8cd-bqv9t\" (UID: \"7f50e5ed-f1dc-4706-8a7c-0d7e85a06351\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.001437 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scglf\" (UniqueName: \"kubernetes.io/projected/dd42edc8-cfba-4913-8d87-7860ecef904f-kube-api-access-scglf\") pod \"keystone-operator-controller-manager-7765d96ddf-rns58\" (UID: \"dd42edc8-cfba-4913-8d87-7860ecef904f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.001475 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkcrn\" (UniqueName: \"kubernetes.io/projected/cf89e5f1-3460-42f6-b66f-6a556118cd30-kube-api-access-gkcrn\") pod \"heat-operator-controller-manager-5f64f6f8bb-w54rz\" (UID: \"cf89e5f1-3460-42f6-b66f-6a556118cd30\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.011057 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.013232 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.016471 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-9vb8h"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.027129 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.028239 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.039874 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zkjhv"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.047057 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-966kw\" (UniqueName: \"kubernetes.io/projected/d8e13eee-1041-4fb9-b8d1-6169c42d5de3-kube-api-access-966kw\") pod \"designate-operator-controller-manager-78b4bc895b-2vmhb\" (UID: \"d8e13eee-1041-4fb9-b8d1-6169c42d5de3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.058178 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.058729 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.059683 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r42c6\" (UniqueName: \"kubernetes.io/projected/7f50e5ed-f1dc-4706-8a7c-0d7e85a06351-kube-api-access-r42c6\") pod \"glance-operator-controller-manager-77987cd8cd-bqv9t\" (UID: \"7f50e5ed-f1dc-4706-8a7c-0d7e85a06351\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.077324 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.079469 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.079480 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.084641 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5fzr7"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.100749 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.103976 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2ljh\" (UniqueName: \"kubernetes.io/projected/b16dc4e0-2027-45dd-bee8-c5c5346e13f5-kube-api-access-g2ljh\") pod \"ironic-operator-controller-manager-6c548fd776-rpprb\" (UID: \"b16dc4e0-2027-45dd-bee8-c5c5346e13f5\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.104022 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.104058 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52n8h\" (UniqueName: \"kubernetes.io/projected/1c371c8f-7ede-4286-9d7c-6f65f1323237-kube-api-access-52n8h\") pod \"horizon-operator-controller-manager-68c6d99b8f-d22cz\" (UID: \"1c371c8f-7ede-4286-9d7c-6f65f1323237\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.104098 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2t9\" (UniqueName: \"kubernetes.io/projected/3fe917e9-4872-4eb1-9bc4-744c81813123-kube-api-access-7l2t9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ncb9l\" (UID: \"3fe917e9-4872-4eb1-9bc4-744c81813123\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.104128 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbnj4\" (UniqueName: \"kubernetes.io/projected/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-kube-api-access-sbnj4\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.104158 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scglf\" (UniqueName: \"kubernetes.io/projected/dd42edc8-cfba-4913-8d87-7860ecef904f-kube-api-access-scglf\") pod \"keystone-operator-controller-manager-7765d96ddf-rns58\" (UID: \"dd42edc8-cfba-4913-8d87-7860ecef904f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.104177 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g55n\" (UniqueName: \"kubernetes.io/projected/c714457d-536a-48fe-8df4-758cff8fb22d-kube-api-access-7g55n\") pod \"manila-operator-controller-manager-7c79b5df47-h2ldk\" (UID: \"c714457d-536a-48fe-8df4-758cff8fb22d\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.104197 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkcrn\" (UniqueName: \"kubernetes.io/projected/cf89e5f1-3460-42f6-b66f-6a556118cd30-kube-api-access-gkcrn\") pod \"heat-operator-controller-manager-5f64f6f8bb-w54rz\" (UID: \"cf89e5f1-3460-42f6-b66f-6a556118cd30\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz"
Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.104742 4731 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.104790 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert podName:8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:08:58.604773829 +0000 UTC m=+859.203368293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert") pod "infra-operator-controller-manager-777cfc666b-wx49m" (UID: "8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5") : secret "infra-operator-webhook-server-cert" not found
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.105364 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.136582 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.146862 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.147853 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scglf\" (UniqueName: \"kubernetes.io/projected/dd42edc8-cfba-4913-8d87-7860ecef904f-kube-api-access-scglf\") pod \"keystone-operator-controller-manager-7765d96ddf-rns58\" (UID: \"dd42edc8-cfba-4913-8d87-7860ecef904f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.150346 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52n8h\" (UniqueName: \"kubernetes.io/projected/1c371c8f-7ede-4286-9d7c-6f65f1323237-kube-api-access-52n8h\") pod \"horizon-operator-controller-manager-68c6d99b8f-d22cz\" (UID: \"1c371c8f-7ede-4286-9d7c-6f65f1323237\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.150739 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkcrn\" (UniqueName: \"kubernetes.io/projected/cf89e5f1-3460-42f6-b66f-6a556118cd30-kube-api-access-gkcrn\") pod \"heat-operator-controller-manager-5f64f6f8bb-w54rz\" (UID: \"cf89e5f1-3460-42f6-b66f-6a556118cd30\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.153106 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnj4\" (UniqueName: \"kubernetes.io/projected/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-kube-api-access-sbnj4\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.165023 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2ljh\" (UniqueName: \"kubernetes.io/projected/b16dc4e0-2027-45dd-bee8-c5c5346e13f5-kube-api-access-g2ljh\") pod \"ironic-operator-controller-manager-6c548fd776-rpprb\" (UID: \"b16dc4e0-2027-45dd-bee8-c5c5346e13f5\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.165104 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.166315 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.177092 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rrvgf"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.187684 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.212890 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.219499 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.229215 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-lhtq7"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.230490 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2t9\" (UniqueName: \"kubernetes.io/projected/3fe917e9-4872-4eb1-9bc4-744c81813123-kube-api-access-7l2t9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ncb9l\" (UID: \"3fe917e9-4872-4eb1-9bc4-744c81813123\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.230602 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcxsw\" (UniqueName: \"kubernetes.io/projected/6085f8d0-d279-4997-a86a-3e539495c9d0-kube-api-access-kcxsw\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-54m7x\" (UID: \"6085f8d0-d279-4997-a86a-3e539495c9d0\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.230691 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g55n\" (UniqueName: \"kubernetes.io/projected/c714457d-536a-48fe-8df4-758cff8fb22d-kube-api-access-7g55n\") pod \"manila-operator-controller-manager-7c79b5df47-h2ldk\" (UID: \"c714457d-536a-48fe-8df4-758cff8fb22d\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.232568 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.232936 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.273815 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g55n\" (UniqueName: \"kubernetes.io/projected/c714457d-536a-48fe-8df4-758cff8fb22d-kube-api-access-7g55n\") pod \"manila-operator-controller-manager-7c79b5df47-h2ldk\" (UID: \"c714457d-536a-48fe-8df4-758cff8fb22d\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.285104 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2t9\" (UniqueName: \"kubernetes.io/projected/3fe917e9-4872-4eb1-9bc4-744c81813123-kube-api-access-7l2t9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ncb9l\" (UID: \"3fe917e9-4872-4eb1-9bc4-744c81813123\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.298353 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.300048 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.300651 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.312733 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.314681 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.323728 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bf9qf"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.336809 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.338047 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcxsw\" (UniqueName: \"kubernetes.io/projected/6085f8d0-d279-4997-a86a-3e539495c9d0-kube-api-access-kcxsw\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-54m7x\" (UID: \"6085f8d0-d279-4997-a86a-3e539495c9d0\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.338111 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kflw\" (UniqueName: \"kubernetes.io/projected/2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f-kube-api-access-9kflw\") pod \"octavia-operator-controller-manager-998648c74-ddmb5\" (UID: \"2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.338151 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4pgm\" (UniqueName: \"kubernetes.io/projected/37985ade-8410-4ecb-af0c-4d7bdd40608a-kube-api-access-x4pgm\") pod \"nova-operator-controller-manager-697bc559fc-85q9h\" (UID: \"37985ade-8410-4ecb-af0c-4d7bdd40608a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.375314 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcxsw\" (UniqueName: \"kubernetes.io/projected/6085f8d0-d279-4997-a86a-3e539495c9d0-kube-api-access-kcxsw\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-54m7x\" (UID: \"6085f8d0-d279-4997-a86a-3e539495c9d0\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.391151 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.392975 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.393713 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.401967 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-r5wwq"
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.403882 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf"]
Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.405350 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.411547 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n8v7k" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.428854 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.440607 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.440719 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kflw\" (UniqueName: \"kubernetes.io/projected/2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f-kube-api-access-9kflw\") pod \"octavia-operator-controller-manager-998648c74-ddmb5\" (UID: \"2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.440749 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4pgm\" (UniqueName: \"kubernetes.io/projected/37985ade-8410-4ecb-af0c-4d7bdd40608a-kube-api-access-x4pgm\") pod \"nova-operator-controller-manager-697bc559fc-85q9h\" (UID: \"37985ade-8410-4ecb-af0c-4d7bdd40608a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.440772 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkvsg\" (UniqueName: \"kubernetes.io/projected/e625ea8d-55cc-4749-80f6-2848e064a6bf-kube-api-access-dkvsg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.451271 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.460405 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.470586 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.470906 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.474079 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.474220 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.476616 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-d84ls" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.483043 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.484147 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4pgm\" (UniqueName: \"kubernetes.io/projected/37985ade-8410-4ecb-af0c-4d7bdd40608a-kube-api-access-x4pgm\") pod \"nova-operator-controller-manager-697bc559fc-85q9h\" (UID: \"37985ade-8410-4ecb-af0c-4d7bdd40608a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.496210 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.497484 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.499597 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2fxtt" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.505967 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kflw\" (UniqueName: \"kubernetes.io/projected/2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f-kube-api-access-9kflw\") pod \"octavia-operator-controller-manager-998648c74-ddmb5\" (UID: \"2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.511549 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.527926 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-428v7"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.530494 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.534654 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-j47lq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.538277 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-428v7"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.548801 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.552895 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.555141 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.555323 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkvsg\" (UniqueName: \"kubernetes.io/projected/e625ea8d-55cc-4749-80f6-2848e064a6bf-kube-api-access-dkvsg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.555367 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-pvclw\" (UniqueName: \"kubernetes.io/projected/2b4f79c3-66c0-4c91-bfd1-bef243806900-kube-api-access-pvclw\") pod \"placement-operator-controller-manager-78f8948974-9b2cf\" (UID: \"2b4f79c3-66c0-4c91-bfd1-bef243806900\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.555391 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbcm\" (UniqueName: \"kubernetes.io/projected/2efe3c0e-8643-45f4-920e-17aa65157644-kube-api-access-zxbcm\") pod \"ovn-operator-controller-manager-b6456fdb6-z74rk\" (UID: \"2efe3c0e-8643-45f4-920e-17aa65157644\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.555607 4731 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.555667 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert podName:e625ea8d-55cc-4749-80f6-2848e064a6bf nodeName:}" failed. No retries permitted until 2025-12-03 19:08:59.0556458 +0000 UTC m=+859.654240274 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" (UID: "e625ea8d-55cc-4749-80f6-2848e064a6bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.566851 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.575887 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-x8t4b" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.581507 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.600833 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkvsg\" (UniqueName: \"kubernetes.io/projected/e625ea8d-55cc-4749-80f6-2848e064a6bf-kube-api-access-dkvsg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.614482 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.660934 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.661797 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.661891 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.664241 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nnnp\" (UniqueName: \"kubernetes.io/projected/e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3-kube-api-access-2nnnp\") pod \"telemetry-operator-controller-manager-76cc84c6bb-qsgsq\" (UID: \"e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.664314 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl9xs\" (UniqueName: \"kubernetes.io/projected/23b6d2e9-27bc-4944-ba47-a592981fa0d3-kube-api-access-nl9xs\") pod \"swift-operator-controller-manager-5f8c65bbfc-zlg7s\" (UID: \"23b6d2e9-27bc-4944-ba47-a592981fa0d3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.664400 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvclw\" (UniqueName: \"kubernetes.io/projected/2b4f79c3-66c0-4c91-bfd1-bef243806900-kube-api-access-pvclw\") pod \"placement-operator-controller-manager-78f8948974-9b2cf\" (UID: \"2b4f79c3-66c0-4c91-bfd1-bef243806900\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.664422 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbcm\" (UniqueName: \"kubernetes.io/projected/2efe3c0e-8643-45f4-920e-17aa65157644-kube-api-access-zxbcm\") pod \"ovn-operator-controller-manager-b6456fdb6-z74rk\" (UID: \"2efe3c0e-8643-45f4-920e-17aa65157644\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" Dec 03 19:08:58 
crc kubenswrapper[4731]: I1203 19:08:58.664466 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.664487 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnskw\" (UniqueName: \"kubernetes.io/projected/699771e7-b57b-4435-95d9-964c90bbcc3f-kube-api-access-nnskw\") pod \"test-operator-controller-manager-5854674fcc-428v7\" (UID: \"699771e7-b57b-4435-95d9-964c90bbcc3f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.664533 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64mx6\" (UniqueName: \"kubernetes.io/projected/36852547-6431-45bc-b56e-6f8261334da2-kube-api-access-64mx6\") pod \"watcher-operator-controller-manager-769dc69bc-gt4xb\" (UID: \"36852547-6431-45bc-b56e-6f8261334da2\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.664702 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.665538 4731 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.665580 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert podName:8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5 nodeName:}" failed. 
No retries permitted until 2025-12-03 19:08:59.665562844 +0000 UTC m=+860.264157308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert") pod "infra-operator-controller-manager-777cfc666b-wx49m" (UID: "8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5") : secret "infra-operator-webhook-server-cert" not found Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.670869 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-98ckb" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.670992 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.688746 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.689821 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.706620 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-lf8t7" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.707545 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.710222 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbcm\" (UniqueName: \"kubernetes.io/projected/2efe3c0e-8643-45f4-920e-17aa65157644-kube-api-access-zxbcm\") pod \"ovn-operator-controller-manager-b6456fdb6-z74rk\" (UID: \"2efe3c0e-8643-45f4-920e-17aa65157644\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.810705 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnskw\" (UniqueName: \"kubernetes.io/projected/699771e7-b57b-4435-95d9-964c90bbcc3f-kube-api-access-nnskw\") pod \"test-operator-controller-manager-5854674fcc-428v7\" (UID: \"699771e7-b57b-4435-95d9-964c90bbcc3f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.810832 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdrz\" (UniqueName: \"kubernetes.io/projected/771782b8-3be4-499f-96b9-5b862de7f654-kube-api-access-2mdrz\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.810901 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-64mx6\" (UniqueName: \"kubernetes.io/projected/36852547-6431-45bc-b56e-6f8261334da2-kube-api-access-64mx6\") pod \"watcher-operator-controller-manager-769dc69bc-gt4xb\" (UID: \"36852547-6431-45bc-b56e-6f8261334da2\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.811000 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nnnp\" (UniqueName: \"kubernetes.io/projected/e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3-kube-api-access-2nnnp\") pod \"telemetry-operator-controller-manager-76cc84c6bb-qsgsq\" (UID: \"e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.811038 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl9xs\" (UniqueName: \"kubernetes.io/projected/23b6d2e9-27bc-4944-ba47-a592981fa0d3-kube-api-access-nl9xs\") pod \"swift-operator-controller-manager-5f8c65bbfc-zlg7s\" (UID: \"23b6d2e9-27bc-4944-ba47-a592981fa0d3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.811079 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.811428 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvclw\" (UniqueName: 
\"kubernetes.io/projected/2b4f79c3-66c0-4c91-bfd1-bef243806900-kube-api-access-pvclw\") pod \"placement-operator-controller-manager-78f8948974-9b2cf\" (UID: \"2b4f79c3-66c0-4c91-bfd1-bef243806900\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.839527 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.889111 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64mx6\" (UniqueName: \"kubernetes.io/projected/36852547-6431-45bc-b56e-6f8261334da2-kube-api-access-64mx6\") pod \"watcher-operator-controller-manager-769dc69bc-gt4xb\" (UID: \"36852547-6431-45bc-b56e-6f8261334da2\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.889627 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nnnp\" (UniqueName: \"kubernetes.io/projected/e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3-kube-api-access-2nnnp\") pod \"telemetry-operator-controller-manager-76cc84c6bb-qsgsq\" (UID: \"e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.893205 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.893778 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nl9xs\" (UniqueName: \"kubernetes.io/projected/23b6d2e9-27bc-4944-ba47-a592981fa0d3-kube-api-access-nl9xs\") pod \"swift-operator-controller-manager-5f8c65bbfc-zlg7s\" (UID: \"23b6d2e9-27bc-4944-ba47-a592981fa0d3\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.900096 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnskw\" (UniqueName: \"kubernetes.io/projected/699771e7-b57b-4435-95d9-964c90bbcc3f-kube-api-access-nnskw\") pod \"test-operator-controller-manager-5854674fcc-428v7\" (UID: \"699771e7-b57b-4435-95d9-964c90bbcc3f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.906865 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.946824 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.954088 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdrz\" (UniqueName: \"kubernetes.io/projected/771782b8-3be4-499f-96b9-5b862de7f654-kube-api-access-2mdrz\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.954225 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.954296 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2cvg\" (UniqueName: \"kubernetes.io/projected/ce8b3502-d3bc-463d-b3a4-a758b6b42acb-kube-api-access-c2cvg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gwz5z\" (UID: \"ce8b3502-d3bc-463d-b3a4-a758b6b42acb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.954364 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.962338 4731 
secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.962431 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:08:59.462409386 +0000 UTC m=+860.061003840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "webhook-server-cert" not found Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.962591 4731 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 19:08:58 crc kubenswrapper[4731]: E1203 19:08:58.962651 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:08:59.462633863 +0000 UTC m=+860.061228327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "metrics-server-cert" not found Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.976500 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk"] Dec 03 19:08:58 crc kubenswrapper[4731]: I1203 19:08:58.980491 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.002568 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdrz\" (UniqueName: \"kubernetes.io/projected/771782b8-3be4-499f-96b9-5b862de7f654-kube-api-access-2mdrz\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.003428 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.028633 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p8dqn"] Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.028900 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p8dqn" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerName="registry-server" containerID="cri-o://ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993" gracePeriod=2 Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.058867 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2cvg\" (UniqueName: \"kubernetes.io/projected/ce8b3502-d3bc-463d-b3a4-a758b6b42acb-kube-api-access-c2cvg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gwz5z\" (UID: \"ce8b3502-d3bc-463d-b3a4-a758b6b42acb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.059007 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:08:59 crc kubenswrapper[4731]: E1203 19:08:59.059140 4731 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:08:59 crc kubenswrapper[4731]: E1203 19:08:59.059193 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert podName:e625ea8d-55cc-4749-80f6-2848e064a6bf nodeName:}" failed. No retries permitted until 2025-12-03 19:09:00.059174585 +0000 UTC m=+860.657769049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" (UID: "e625ea8d-55cc-4749-80f6-2848e064a6bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.083985 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2cvg\" (UniqueName: \"kubernetes.io/projected/ce8b3502-d3bc-463d-b3a4-a758b6b42acb-kube-api-access-c2cvg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gwz5z\" (UID: \"ce8b3502-d3bc-463d-b3a4-a758b6b42acb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.088824 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.115681 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.154648 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.443923 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb"] Dec 03 19:08:59 crc kubenswrapper[4731]: W1203 19:08:59.455285 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e13eee_1041_4fb9_b8d1_6169c42d5de3.slice/crio-151d283541481fbebd0480393118a04c4435b5fc4897f78ce10a5e2853a60c97 WatchSource:0}: Error finding container 151d283541481fbebd0480393118a04c4435b5fc4897f78ce10a5e2853a60c97: Status 404 returned error can't find the container with id 151d283541481fbebd0480393118a04c4435b5fc4897f78ce10a5e2853a60c97 Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.468971 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.469108 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:08:59 crc kubenswrapper[4731]: E1203 
19:08:59.469232 4731 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 19:08:59 crc kubenswrapper[4731]: E1203 19:08:59.469297 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:00.469281996 +0000 UTC m=+861.067876460 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "webhook-server-cert" not found Dec 03 19:08:59 crc kubenswrapper[4731]: E1203 19:08:59.469387 4731 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 19:08:59 crc kubenswrapper[4731]: E1203 19:08:59.469462 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:00.469439071 +0000 UTC m=+861.068033525 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "metrics-server-cert" not found Dec 03 19:08:59 crc kubenswrapper[4731]: E1203 19:08:59.678374 4731 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 19:08:59 crc kubenswrapper[4731]: E1203 19:08:59.678797 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert podName:8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:01.678778957 +0000 UTC m=+862.277373421 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert") pod "infra-operator-controller-manager-777cfc666b-wx49m" (UID: "8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5") : secret "infra-operator-webhook-server-cert" not found Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.679389 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.785023 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz"] Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.819356 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.906585 4731 generic.go:334] "Generic (PLEG): container finished" podID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerID="ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993" exitCode=0 Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.906688 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8dqn" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.907802 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x"] Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.907830 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz"] Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.907842 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb" event={"ID":"d8e13eee-1041-4fb9-b8d1-6169c42d5de3","Type":"ContainerStarted","Data":"151d283541481fbebd0480393118a04c4435b5fc4897f78ce10a5e2853a60c97"} Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.907863 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb" event={"ID":"362b0743-3165-454e-93b9-b6713d26680b","Type":"ContainerStarted","Data":"dd3b910cab94dd03dfff15e6f1e50a614b6350eeaa00cbbabd3e5c9b554135b3"} Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.907874 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz" event={"ID":"1c371c8f-7ede-4286-9d7c-6f65f1323237","Type":"ContainerStarted","Data":"cabdb40f6b5c13b2b328cfa5ebc141e2bd48e4e2c148ed350640987e08e6f18b"} Dec 03 19:08:59 crc 
kubenswrapper[4731]: I1203 19:08:59.907886 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk" event={"ID":"f6d78107-9c18-4cca-afa8-a360f45c6bac","Type":"ContainerStarted","Data":"3585f59297821c3ad33f17c3e62f2f6b142bcc95590a0acca6081b300404d98e"} Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.907896 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8dqn" event={"ID":"2720d9d3-677f-4aff-bc9c-f669bff86f60","Type":"ContainerDied","Data":"ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993"} Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.907909 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8dqn" event={"ID":"2720d9d3-677f-4aff-bc9c-f669bff86f60","Type":"ContainerDied","Data":"e15a5a07d0df18952570ca8ee3dc5114e2aed1389190677a658dcdfa22eb6ac6"} Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.907929 4731 scope.go:117] "RemoveContainer" containerID="ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993" Dec 03 19:08:59 crc kubenswrapper[4731]: W1203 19:08:59.908296 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16dc4e0_2027_45dd_bee8_c5c5346e13f5.slice/crio-b85dea3fc0e2082746588490540a22c91af6d3b24b5146c05806e31dc0b51e37 WatchSource:0}: Error finding container b85dea3fc0e2082746588490540a22c91af6d3b24b5146c05806e31dc0b51e37: Status 404 returned error can't find the container with id b85dea3fc0e2082746588490540a22c91af6d3b24b5146c05806e31dc0b51e37 Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.915449 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t"] Dec 03 19:08:59 crc kubenswrapper[4731]: W1203 19:08:59.924733 4731 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37985ade_8410_4ecb_af0c_4d7bdd40608a.slice/crio-23ebf00dc60aaffdb02accc92b1eabacebf5ca047957a3f2de628dbb58c8cf88 WatchSource:0}: Error finding container 23ebf00dc60aaffdb02accc92b1eabacebf5ca047957a3f2de628dbb58c8cf88: Status 404 returned error can't find the container with id 23ebf00dc60aaffdb02accc92b1eabacebf5ca047957a3f2de628dbb58c8cf88 Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.943115 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk"] Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.956181 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58"] Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.957224 4731 scope.go:117] "RemoveContainer" containerID="b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc" Dec 03 19:08:59 crc kubenswrapper[4731]: W1203 19:08:59.964529 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe917e9_4872_4eb1_9bc4_744c81813123.slice/crio-5331129dc4474efb4ffa461f3d7642c2306f76691753036dd968c7b1d3b72484 WatchSource:0}: Error finding container 5331129dc4474efb4ffa461f3d7642c2306f76691753036dd968c7b1d3b72484: Status 404 returned error can't find the container with id 5331129dc4474efb4ffa461f3d7642c2306f76691753036dd968c7b1d3b72484 Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.971544 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb"] Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.983937 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h"] Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.984369 
4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-catalog-content\") pod \"2720d9d3-677f-4aff-bc9c-f669bff86f60\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.984488 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx9f2\" (UniqueName: \"kubernetes.io/projected/2720d9d3-677f-4aff-bc9c-f669bff86f60-kube-api-access-cx9f2\") pod \"2720d9d3-677f-4aff-bc9c-f669bff86f60\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.984592 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-utilities\") pod \"2720d9d3-677f-4aff-bc9c-f669bff86f60\" (UID: \"2720d9d3-677f-4aff-bc9c-f669bff86f60\") " Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.986651 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-utilities" (OuterVolumeSpecName: "utilities") pod "2720d9d3-677f-4aff-bc9c-f669bff86f60" (UID: "2720d9d3-677f-4aff-bc9c-f669bff86f60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.991134 4731 scope.go:117] "RemoveContainer" containerID="1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.993145 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2720d9d3-677f-4aff-bc9c-f669bff86f60-kube-api-access-cx9f2" (OuterVolumeSpecName: "kube-api-access-cx9f2") pod "2720d9d3-677f-4aff-bc9c-f669bff86f60" (UID: "2720d9d3-677f-4aff-bc9c-f669bff86f60"). 
InnerVolumeSpecName "kube-api-access-cx9f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:08:59 crc kubenswrapper[4731]: I1203 19:08:59.996538 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5"] Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.002620 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l"] Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.017335 4731 scope.go:117] "RemoveContainer" containerID="ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.017829 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993\": container with ID starting with ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993 not found: ID does not exist" containerID="ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.017868 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993"} err="failed to get container status \"ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993\": rpc error: code = NotFound desc = could not find container \"ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993\": container with ID starting with ad1e40e8f8eb66ec6c4aad1ad9e65eb18374ffb4a0997ea53389684f42ae6993 not found: ID does not exist" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.017895 4731 scope.go:117] "RemoveContainer" containerID="b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.018485 4731 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc\": container with ID starting with b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc not found: ID does not exist" containerID="b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.018509 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc"} err="failed to get container status \"b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc\": rpc error: code = NotFound desc = could not find container \"b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc\": container with ID starting with b7c819781f0073ed432577c246c92d55c41af98a613e5891ce4a3556c4047ddc not found: ID does not exist" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.018524 4731 scope.go:117] "RemoveContainer" containerID="1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.020204 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01\": container with ID starting with 1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01 not found: ID does not exist" containerID="1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.020230 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01"} err="failed to get container status \"1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01\": rpc error: code = NotFound desc = could 
not find container \"1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01\": container with ID starting with 1a103750b5213fc59941b9aba21542ff75ef311fcd46fd66a313ea2285212d01 not found: ID does not exist" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.042072 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2720d9d3-677f-4aff-bc9c-f669bff86f60" (UID: "2720d9d3-677f-4aff-bc9c-f669bff86f60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.087801 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.088168 4731 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.088371 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert podName:e625ea8d-55cc-4749-80f6-2848e064a6bf nodeName:}" failed. No retries permitted until 2025-12-03 19:09:02.088329292 +0000 UTC m=+862.686923756 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" (UID: "e625ea8d-55cc-4749-80f6-2848e064a6bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.088466 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.088483 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx9f2\" (UniqueName: \"kubernetes.io/projected/2720d9d3-677f-4aff-bc9c-f669bff86f60-kube-api-access-cx9f2\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.088512 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2720d9d3-677f-4aff-bc9c-f669bff86f60-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.136118 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z"] Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.170541 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk"] Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.186705 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf"] Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.188041 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb"] Dec 03 19:09:00 crc kubenswrapper[4731]: W1203 19:09:00.199358 4731 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36852547_6431_45bc_b56e_6f8261334da2.slice/crio-c0401baac1059403a5cca636981ad75236b099da45d41dcb3cc105b527c6f128 WatchSource:0}: Error finding container c0401baac1059403a5cca636981ad75236b099da45d41dcb3cc105b527c6f128: Status 404 returned error can't find the container with id c0401baac1059403a5cca636981ad75236b099da45d41dcb3cc105b527c6f128 Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.209033 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-428v7"] Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.210699 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-64mx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-gt4xb_openstack-operators(36852547-6431-45bc-b56e-6f8261334da2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.210840 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nnskw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-428v7_openstack-operators(699771e7-b57b-4435-95d9-964c90bbcc3f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.211839 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zxbcm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-z74rk_openstack-operators(2efe3c0e-8643-45f4-920e-17aa65157644): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.212611 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-64mx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-gt4xb_openstack-operators(36852547-6431-45bc-b56e-6f8261334da2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.213768 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" podUID="36852547-6431-45bc-b56e-6f8261334da2" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.213992 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zxbcm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-z74rk_openstack-operators(2efe3c0e-8643-45f4-920e-17aa65157644): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.215972 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" 
podUID="2efe3c0e-8643-45f4-920e-17aa65157644" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.217490 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nnskw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-428v7_openstack-operators(699771e7-b57b-4435-95d9-964c90bbcc3f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.219025 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" podUID="699771e7-b57b-4435-95d9-964c90bbcc3f" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.254643 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p8dqn"] Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.261842 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p8dqn"] Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.294454 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq"] Dec 03 19:09:00 crc kubenswrapper[4731]: W1203 19:09:00.296657 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode63db4fb_b3e3_45bb_bd2d_d5af9574d8a3.slice/crio-f72183a2faeb84ce17413ee40e015d44bda540d14cd185fc28fa364273c0ff81 WatchSource:0}: Error finding container f72183a2faeb84ce17413ee40e015d44bda540d14cd185fc28fa364273c0ff81: Status 404 returned error can't find the container with id f72183a2faeb84ce17413ee40e015d44bda540d14cd185fc28fa364273c0ff81 Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.306637 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2nnnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-qsgsq_openstack-operators(e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.313738 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2nnnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-qsgsq_openstack-operators(e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.314947 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" podUID="e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.324947 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s"] Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.332509 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nl9xs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-zlg7s_openstack-operators(23b6d2e9-27bc-4944-ba47-a592981fa0d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.334142 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nl9xs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-zlg7s_openstack-operators(23b6d2e9-27bc-4944-ba47-a592981fa0d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.335473 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" podUID="23b6d2e9-27bc-4944-ba47-a592981fa0d3" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.495457 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.495583 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.495625 4731 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.495702 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:02.495680596 +0000 UTC m=+863.094275060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "webhook-server-cert" not found Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.495705 4731 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.495741 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:02.495731598 +0000 UTC m=+863.094326062 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "metrics-server-cert" not found Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.926884 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" event={"ID":"699771e7-b57b-4435-95d9-964c90bbcc3f","Type":"ContainerStarted","Data":"d950aabcb14fdd90bb48821ef2524a4fee156c1aa21fbc44215bf560e007a4d3"} Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.931033 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" podUID="699771e7-b57b-4435-95d9-964c90bbcc3f" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.933736 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" event={"ID":"e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3","Type":"ContainerStarted","Data":"f72183a2faeb84ce17413ee40e015d44bda540d14cd185fc28fa364273c0ff81"} Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.937238 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" event={"ID":"23b6d2e9-27bc-4944-ba47-a592981fa0d3","Type":"ContainerStarted","Data":"b9eb65b98135b0a73e41d34dc9030d296562fa88c67f66cd9593a536d56a03fc"} Dec 03 19:09:00 crc 
kubenswrapper[4731]: E1203 19:09:00.943905 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" podUID="e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3" Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.947408 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" podUID="23b6d2e9-27bc-4944-ba47-a592981fa0d3" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.948361 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb" event={"ID":"b16dc4e0-2027-45dd-bee8-c5c5346e13f5","Type":"ContainerStarted","Data":"b85dea3fc0e2082746588490540a22c91af6d3b24b5146c05806e31dc0b51e37"} Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.953366 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" event={"ID":"37985ade-8410-4ecb-af0c-4d7bdd40608a","Type":"ContainerStarted","Data":"23ebf00dc60aaffdb02accc92b1eabacebf5ca047957a3f2de628dbb58c8cf88"} Dec 03 19:09:00 crc 
kubenswrapper[4731]: I1203 19:09:00.961554 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" event={"ID":"2b4f79c3-66c0-4c91-bfd1-bef243806900","Type":"ContainerStarted","Data":"428e15883cdc469d24e7e81395ead32bb9d32a3eb7846a05dfc800f69fb2c9b8"} Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.966911 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" event={"ID":"36852547-6431-45bc-b56e-6f8261334da2","Type":"ContainerStarted","Data":"c0401baac1059403a5cca636981ad75236b099da45d41dcb3cc105b527c6f128"} Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.973382 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" podUID="36852547-6431-45bc-b56e-6f8261334da2" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.973894 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" event={"ID":"ce8b3502-d3bc-463d-b3a4-a758b6b42acb","Type":"ContainerStarted","Data":"7dd3605fb8272ae2655410c5357a43f795c4142ff2d253842d3c3d7d954c0322"} Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.979627 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" 
event={"ID":"2efe3c0e-8643-45f4-920e-17aa65157644","Type":"ContainerStarted","Data":"961678f7fe7868fb94c407479499f48a9f8f115d7ea3f65c914f3902642e05a5"} Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.982783 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk" event={"ID":"c714457d-536a-48fe-8df4-758cff8fb22d","Type":"ContainerStarted","Data":"30790fd43f1570d9ca48b89c6e14316db694ddca1fde0c27117aa1575e67f5a7"} Dec 03 19:09:00 crc kubenswrapper[4731]: E1203 19:09:00.985725 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" podUID="2efe3c0e-8643-45f4-920e-17aa65157644" Dec 03 19:09:00 crc kubenswrapper[4731]: I1203 19:09:00.990335 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" event={"ID":"2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f","Type":"ContainerStarted","Data":"a9338c48b485e623d368b0c05e9de1097f978537711874e8899f5cd8abd7a7e9"} Dec 03 19:09:01 crc kubenswrapper[4731]: I1203 19:09:01.001039 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58" event={"ID":"dd42edc8-cfba-4913-8d87-7860ecef904f","Type":"ContainerStarted","Data":"4ad31d2c0f9736bd6a9cf79f8a9969386513a1765c421cd4a14ad6659309dfc0"} Dec 03 19:09:01 crc kubenswrapper[4731]: I1203 19:09:01.004928 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x" event={"ID":"6085f8d0-d279-4997-a86a-3e539495c9d0","Type":"ContainerStarted","Data":"9ec230c1cc5f94d61d658e9a590ebf919487e00652478fcfa8fa86678c9765fd"} Dec 03 19:09:01 crc kubenswrapper[4731]: I1203 19:09:01.021817 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t" event={"ID":"7f50e5ed-f1dc-4706-8a7c-0d7e85a06351","Type":"ContainerStarted","Data":"3ac33ba80e426fdc8f9e55ec7095d6cc3cf0d4a5edb6358a9d93e254b44c513a"} Dec 03 19:09:01 crc kubenswrapper[4731]: I1203 19:09:01.031372 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz" event={"ID":"cf89e5f1-3460-42f6-b66f-6a556118cd30","Type":"ContainerStarted","Data":"ae79d6c93cc6b9aaed54eb5991e90d426fe7fd0ff188c395c5a1d6717775af5b"} Dec 03 19:09:01 crc kubenswrapper[4731]: I1203 19:09:01.041456 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l" event={"ID":"3fe917e9-4872-4eb1-9bc4-744c81813123","Type":"ContainerStarted","Data":"5331129dc4474efb4ffa461f3d7642c2306f76691753036dd968c7b1d3b72484"} Dec 03 19:09:01 crc kubenswrapper[4731]: I1203 19:09:01.713824 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 19:09:01 crc kubenswrapper[4731]: E1203 19:09:01.714206 4731 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 19:09:01 crc kubenswrapper[4731]: E1203 19:09:01.714314 4731 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert podName:8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:05.714291062 +0000 UTC m=+866.312885526 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert") pod "infra-operator-controller-manager-777cfc666b-wx49m" (UID: "8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5") : secret "infra-operator-webhook-server-cert" not found Dec 03 19:09:01 crc kubenswrapper[4731]: I1203 19:09:01.867921 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" path="/var/lib/kubelet/pods/2720d9d3-677f-4aff-bc9c-f669bff86f60/volumes" Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.093681 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" podUID="2efe3c0e-8643-45f4-920e-17aa65157644" Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.093773 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" podUID="23b6d2e9-27bc-4944-ba47-a592981fa0d3" Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.095202 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" podUID="36852547-6431-45bc-b56e-6f8261334da2" Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.095283 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" podUID="e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3" Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.095751 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" podUID="699771e7-b57b-4435-95d9-964c90bbcc3f" Dec 03 19:09:02 crc kubenswrapper[4731]: I1203 19:09:02.138882 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.139887 4731 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.139963 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert podName:e625ea8d-55cc-4749-80f6-2848e064a6bf nodeName:}" failed. No retries permitted until 2025-12-03 19:09:06.139936085 +0000 UTC m=+866.738530549 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" (UID: "e625ea8d-55cc-4749-80f6-2848e064a6bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:09:02 crc kubenswrapper[4731]: I1203 19:09:02.544793 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:02 crc kubenswrapper[4731]: I1203 19:09:02.544875 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.545030 4731 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.545031 4731 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.545094 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:06.545076639 +0000 UTC m=+867.143671103 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "webhook-server-cert" not found Dec 03 19:09:02 crc kubenswrapper[4731]: E1203 19:09:02.545113 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:06.54510736 +0000 UTC m=+867.143701824 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "metrics-server-cert" not found Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.033461 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fsx4k"] Dec 03 19:09:03 crc kubenswrapper[4731]: E1203 19:09:03.033850 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerName="extract-utilities" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.033869 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerName="extract-utilities" Dec 03 19:09:03 crc kubenswrapper[4731]: E1203 19:09:03.033885 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerName="extract-content" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.033895 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerName="extract-content" Dec 03 19:09:03 crc kubenswrapper[4731]: E1203 19:09:03.033906 4731 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerName="registry-server" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.033912 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerName="registry-server" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.034074 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2720d9d3-677f-4aff-bc9c-f669bff86f60" containerName="registry-server" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.035343 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.050997 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsx4k"] Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.056234 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-catalog-content\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.056306 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-utilities\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.056398 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwrz\" (UniqueName: 
\"kubernetes.io/projected/bba6fc51-4a48-407c-a78f-fba285b49684-kube-api-access-wpwrz\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.157685 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-utilities\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.157778 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwrz\" (UniqueName: \"kubernetes.io/projected/bba6fc51-4a48-407c-a78f-fba285b49684-kube-api-access-wpwrz\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.157861 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-catalog-content\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.158209 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-utilities\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.158366 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-catalog-content\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.197014 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwrz\" (UniqueName: \"kubernetes.io/projected/bba6fc51-4a48-407c-a78f-fba285b49684-kube-api-access-wpwrz\") pod \"redhat-marketplace-fsx4k\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.359783 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.501155 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.501231 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:09:03 crc kubenswrapper[4731]: I1203 19:09:03.559662 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:09:04 crc kubenswrapper[4731]: I1203 19:09:04.188430 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:09:05 crc kubenswrapper[4731]: I1203 19:09:05.804310 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 
19:09:05 crc kubenswrapper[4731]: E1203 19:09:05.804555 4731 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 19:09:05 crc kubenswrapper[4731]: E1203 19:09:05.804960 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert podName:8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:13.8049282 +0000 UTC m=+874.403522704 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert") pod "infra-operator-controller-manager-777cfc666b-wx49m" (UID: "8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5") : secret "infra-operator-webhook-server-cert" not found Dec 03 19:09:05 crc kubenswrapper[4731]: I1203 19:09:05.830722 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7hw7"] Dec 03 19:09:06 crc kubenswrapper[4731]: I1203 19:09:06.115154 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d7hw7" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="registry-server" containerID="cri-o://f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90" gracePeriod=2 Dec 03 19:09:06 crc kubenswrapper[4731]: I1203 19:09:06.210125 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:09:06 crc kubenswrapper[4731]: E1203 19:09:06.210321 4731 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:09:06 crc kubenswrapper[4731]: E1203 19:09:06.210414 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert podName:e625ea8d-55cc-4749-80f6-2848e064a6bf nodeName:}" failed. No retries permitted until 2025-12-03 19:09:14.210389905 +0000 UTC m=+874.808984369 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" (UID: "e625ea8d-55cc-4749-80f6-2848e064a6bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 19:09:06 crc kubenswrapper[4731]: I1203 19:09:06.615606 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:06 crc kubenswrapper[4731]: I1203 19:09:06.615738 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:06 crc kubenswrapper[4731]: E1203 19:09:06.615844 4731 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 19:09:06 crc kubenswrapper[4731]: E1203 19:09:06.615914 4731 secret.go:188] Couldn't get secret 
openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 19:09:06 crc kubenswrapper[4731]: E1203 19:09:06.615940 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:14.615915001 +0000 UTC m=+875.214509465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "metrics-server-cert" not found Dec 03 19:09:06 crc kubenswrapper[4731]: E1203 19:09:06.615985 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:14.615961443 +0000 UTC m=+875.214556017 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "webhook-server-cert" not found Dec 03 19:09:07 crc kubenswrapper[4731]: I1203 19:09:07.123334 4731 generic.go:334] "Generic (PLEG): container finished" podID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerID="f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90" exitCode=0 Dec 03 19:09:07 crc kubenswrapper[4731]: I1203 19:09:07.123384 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7hw7" event={"ID":"564cdd3d-bea0-4d91-b1d3-b483b550209d","Type":"ContainerDied","Data":"f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90"} Dec 03 19:09:13 crc kubenswrapper[4731]: E1203 19:09:13.049537 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 03 19:09:13 crc kubenswrapper[4731]: E1203 19:09:13.050418 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7l2t9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-ncb9l_openstack-operators(3fe917e9-4872-4eb1-9bc4-744c81813123): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:13 crc kubenswrapper[4731]: E1203 19:09:13.501075 4731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90 is running failed: container process not found" containerID="f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 19:09:13 crc kubenswrapper[4731]: E1203 19:09:13.501833 4731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90 is running failed: container process not found" containerID="f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 19:09:13 crc kubenswrapper[4731]: E1203 19:09:13.502526 4731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90 is running failed: container process not found" containerID="f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 19:09:13 crc kubenswrapper[4731]: E1203 19:09:13.502582 4731 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-d7hw7" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="registry-server" Dec 03 19:09:13 crc kubenswrapper[4731]: I1203 19:09:13.839039 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " 
pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 19:09:13 crc kubenswrapper[4731]: I1203 19:09:13.848943 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5-cert\") pod \"infra-operator-controller-manager-777cfc666b-wx49m\" (UID: \"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5\") " pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 19:09:13 crc kubenswrapper[4731]: E1203 19:09:13.994105 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 03 19:09:13 crc kubenswrapper[4731]: E1203 19:09:13.994359 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v7xgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-rlbmk_openstack-operators(f6d78107-9c18-4cca-afa8-a360f45c6bac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:14 crc kubenswrapper[4731]: I1203 19:09:14.140286 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 19:09:14 crc kubenswrapper[4731]: I1203 19:09:14.245962 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:09:14 crc kubenswrapper[4731]: I1203 19:09:14.249206 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e625ea8d-55cc-4749-80f6-2848e064a6bf-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq\" (UID: \"e625ea8d-55cc-4749-80f6-2848e064a6bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:09:14 crc kubenswrapper[4731]: I1203 19:09:14.535521 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:09:14 crc kubenswrapper[4731]: I1203 19:09:14.652684 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:14 crc kubenswrapper[4731]: I1203 19:09:14.652805 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:14 crc kubenswrapper[4731]: E1203 19:09:14.652910 4731 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 19:09:14 crc kubenswrapper[4731]: E1203 19:09:14.652947 4731 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 19:09:14 crc kubenswrapper[4731]: E1203 19:09:14.652993 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:30.652972219 +0000 UTC m=+891.251566683 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "metrics-server-cert" not found Dec 03 19:09:14 crc kubenswrapper[4731]: E1203 19:09:14.653012 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs podName:771782b8-3be4-499f-96b9-5b862de7f654 nodeName:}" failed. No retries permitted until 2025-12-03 19:09:30.65300611 +0000 UTC m=+891.251600574 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs") pod "openstack-operator-controller-manager-5d8f48999-r8rg8" (UID: "771782b8-3be4-499f-96b9-5b862de7f654") : secret "webhook-server-cert" not found Dec 03 19:09:14 crc kubenswrapper[4731]: E1203 19:09:14.690310 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 03 19:09:14 crc kubenswrapper[4731]: E1203 19:09:14.690923 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kcxsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-54m7x_openstack-operators(6085f8d0-d279-4997-a86a-3e539495c9d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:15 crc kubenswrapper[4731]: E1203 19:09:15.334024 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 03 19:09:15 crc kubenswrapper[4731]: E1203 19:09:15.334405 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gkcrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-w54rz_openstack-operators(cf89e5f1-3460-42f6-b66f-6a556118cd30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:16 crc kubenswrapper[4731]: E1203 19:09:16.065368 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 03 19:09:16 crc kubenswrapper[4731]: E1203 19:09:16.065723 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-966kw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-2vmhb_openstack-operators(d8e13eee-1041-4fb9-b8d1-6169c42d5de3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:16 crc kubenswrapper[4731]: E1203 19:09:16.841636 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 03 19:09:16 crc kubenswrapper[4731]: E1203 19:09:16.842194 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9kflw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-ddmb5_openstack-operators(2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:17 crc kubenswrapper[4731]: E1203 19:09:17.783790 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 03 19:09:17 crc kubenswrapper[4731]: E1203 19:09:17.783993 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r42c6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-bqv9t_openstack-operators(7f50e5ed-f1dc-4706-8a7c-0d7e85a06351): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:19 crc kubenswrapper[4731]: E1203 19:09:19.931565 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 03 19:09:19 crc kubenswrapper[4731]: E1203 19:09:19.932359 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g2ljh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-rpprb_openstack-operators(b16dc4e0-2027-45dd-bee8-c5c5346e13f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:20 crc kubenswrapper[4731]: E1203 19:09:20.727629 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 03 19:09:20 crc kubenswrapper[4731]: E1203 19:09:20.727988 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c2cvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gwz5z_openstack-operators(ce8b3502-d3bc-463d-b3a4-a758b6b42acb): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:20 crc kubenswrapper[4731]: E1203 19:09:20.729165 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" podUID="ce8b3502-d3bc-463d-b3a4-a758b6b42acb" Dec 03 19:09:21 crc kubenswrapper[4731]: E1203 19:09:21.235671 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" podUID="ce8b3502-d3bc-463d-b3a4-a758b6b42acb" Dec 03 19:09:22 crc kubenswrapper[4731]: E1203 19:09:22.576297 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 03 19:09:22 crc kubenswrapper[4731]: E1203 19:09:22.577117 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-scglf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-rns58_openstack-operators(dd42edc8-cfba-4913-8d87-7860ecef904f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:23 crc kubenswrapper[4731]: E1203 19:09:23.085865 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 03 19:09:23 crc kubenswrapper[4731]: E1203 19:09:23.086085 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x4pgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-85q9h_openstack-operators(37985ade-8410-4ecb-af0c-4d7bdd40608a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.115723 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.250205 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7hw7" event={"ID":"564cdd3d-bea0-4d91-b1d3-b483b550209d","Type":"ContainerDied","Data":"0a45e8eba0918952c38deb2af638840f5ef4be454087112e27415c1d56b694d8"} Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.250341 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7hw7" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.250598 4731 scope.go:117] "RemoveContainer" containerID="f9cd194d6d0b6c3a7f33eaf657e071b85d84c6cca21750571959b5d6f90cce90" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.261232 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-utilities\") pod \"564cdd3d-bea0-4d91-b1d3-b483b550209d\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.261356 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-catalog-content\") pod \"564cdd3d-bea0-4d91-b1d3-b483b550209d\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.261429 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6vbw\" (UniqueName: \"kubernetes.io/projected/564cdd3d-bea0-4d91-b1d3-b483b550209d-kube-api-access-b6vbw\") pod \"564cdd3d-bea0-4d91-b1d3-b483b550209d\" (UID: \"564cdd3d-bea0-4d91-b1d3-b483b550209d\") " Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.262598 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-utilities" (OuterVolumeSpecName: "utilities") pod "564cdd3d-bea0-4d91-b1d3-b483b550209d" (UID: "564cdd3d-bea0-4d91-b1d3-b483b550209d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.269212 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564cdd3d-bea0-4d91-b1d3-b483b550209d-kube-api-access-b6vbw" (OuterVolumeSpecName: "kube-api-access-b6vbw") pod "564cdd3d-bea0-4d91-b1d3-b483b550209d" (UID: "564cdd3d-bea0-4d91-b1d3-b483b550209d"). InnerVolumeSpecName "kube-api-access-b6vbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.310801 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "564cdd3d-bea0-4d91-b1d3-b483b550209d" (UID: "564cdd3d-bea0-4d91-b1d3-b483b550209d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.363513 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.363556 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564cdd3d-bea0-4d91-b1d3-b483b550209d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.363571 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6vbw\" (UniqueName: \"kubernetes.io/projected/564cdd3d-bea0-4d91-b1d3-b483b550209d-kube-api-access-b6vbw\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.584497 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7hw7"] Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 
19:09:23.592971 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d7hw7"] Dec 03 19:09:23 crc kubenswrapper[4731]: I1203 19:09:23.864805 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" path="/var/lib/kubelet/pods/564cdd3d-bea0-4d91-b1d3-b483b550209d/volumes" Dec 03 19:09:24 crc kubenswrapper[4731]: I1203 19:09:24.034944 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m"] Dec 03 19:09:26 crc kubenswrapper[4731]: I1203 19:09:26.468714 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:09:26 crc kubenswrapper[4731]: I1203 19:09:26.469414 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:09:27 crc kubenswrapper[4731]: I1203 19:09:27.286515 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" event={"ID":"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5","Type":"ContainerStarted","Data":"ee94ca8791237011f11381e842a177d718aa9d323304269f7bf09923d39165d0"} Dec 03 19:09:27 crc kubenswrapper[4731]: I1203 19:09:27.493292 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq"] Dec 03 19:09:27 crc kubenswrapper[4731]: I1203 19:09:27.539072 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-fsx4k"] Dec 03 19:09:27 crc kubenswrapper[4731]: I1203 19:09:27.593283 4731 scope.go:117] "RemoveContainer" containerID="6c98289d81fff0acb9ada96057ffb5bcf5a3df153fad6104b212522f19ea6c91" Dec 03 19:09:27 crc kubenswrapper[4731]: W1203 19:09:27.610855 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode625ea8d_55cc_4749_80f6_2848e064a6bf.slice/crio-135018ddc86f880d69a09468d4cf42bb90db4ab907f6ffe831ab5be1011f30ff WatchSource:0}: Error finding container 135018ddc86f880d69a09468d4cf42bb90db4ab907f6ffe831ab5be1011f30ff: Status 404 returned error can't find the container with id 135018ddc86f880d69a09468d4cf42bb90db4ab907f6ffe831ab5be1011f30ff Dec 03 19:09:27 crc kubenswrapper[4731]: W1203 19:09:27.614653 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbba6fc51_4a48_407c_a78f_fba285b49684.slice/crio-76c74a8caf8a7e5b40476223e6cbabed5c66987a47a2911fba0dee770e33a302 WatchSource:0}: Error finding container 76c74a8caf8a7e5b40476223e6cbabed5c66987a47a2911fba0dee770e33a302: Status 404 returned error can't find the container with id 76c74a8caf8a7e5b40476223e6cbabed5c66987a47a2911fba0dee770e33a302 Dec 03 19:09:27 crc kubenswrapper[4731]: I1203 19:09:27.734085 4731 scope.go:117] "RemoveContainer" containerID="4d662aaeb1e035c6fe2a26ce4e791bc0979592c3f03ba13af4653e445f8412b7" Dec 03 19:09:28 crc kubenswrapper[4731]: I1203 19:09:28.326132 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsx4k" event={"ID":"bba6fc51-4a48-407c-a78f-fba285b49684","Type":"ContainerStarted","Data":"76c74a8caf8a7e5b40476223e6cbabed5c66987a47a2911fba0dee770e33a302"} Dec 03 19:09:28 crc kubenswrapper[4731]: I1203 19:09:28.338384 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb" event={"ID":"362b0743-3165-454e-93b9-b6713d26680b","Type":"ContainerStarted","Data":"6f12230b0b401d9f6bd73e68e95ade323ecf8ec050a53818e0dbfb6429fd059b"} Dec 03 19:09:28 crc kubenswrapper[4731]: I1203 19:09:28.340541 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk" event={"ID":"c714457d-536a-48fe-8df4-758cff8fb22d","Type":"ContainerStarted","Data":"ad14df085919c953a8a06512b9ac3dc3d199d318912246a2c07371d43fe43810"} Dec 03 19:09:28 crc kubenswrapper[4731]: I1203 19:09:28.345135 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz" event={"ID":"1c371c8f-7ede-4286-9d7c-6f65f1323237","Type":"ContainerStarted","Data":"fa6ec1a7b18869953287d6fd8149d6ef884e28aba7d5100e12b483fd0503ced4"} Dec 03 19:09:28 crc kubenswrapper[4731]: I1203 19:09:28.346392 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" event={"ID":"e625ea8d-55cc-4749-80f6-2848e064a6bf","Type":"ContainerStarted","Data":"135018ddc86f880d69a09468d4cf42bb90db4ab907f6ffe831ab5be1011f30ff"} Dec 03 19:09:28 crc kubenswrapper[4731]: I1203 19:09:28.347909 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" event={"ID":"2b4f79c3-66c0-4c91-bfd1-bef243806900","Type":"ContainerStarted","Data":"c006376c129e37ed2fa3515bf9a8f6de29d4b943f4847af8c516115674154498"} Dec 03 19:09:29 crc kubenswrapper[4731]: I1203 19:09:29.356836 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" event={"ID":"699771e7-b57b-4435-95d9-964c90bbcc3f","Type":"ContainerStarted","Data":"648ccda2e27a15340d4b7e1eb473bc432217753cb4268fe853ea0edc037e9e2d"} Dec 03 19:09:29 
crc kubenswrapper[4731]: I1203 19:09:29.360245 4731 generic.go:334] "Generic (PLEG): container finished" podID="bba6fc51-4a48-407c-a78f-fba285b49684" containerID="f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4" exitCode=0 Dec 03 19:09:29 crc kubenswrapper[4731]: I1203 19:09:29.360300 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsx4k" event={"ID":"bba6fc51-4a48-407c-a78f-fba285b49684","Type":"ContainerDied","Data":"f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4"} Dec 03 19:09:30 crc kubenswrapper[4731]: I1203 19:09:30.730441 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:30 crc kubenswrapper[4731]: I1203 19:09:30.730583 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:30 crc kubenswrapper[4731]: I1203 19:09:30.737745 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-webhook-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:30 crc kubenswrapper[4731]: I1203 19:09:30.743385 4731 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/771782b8-3be4-499f-96b9-5b862de7f654-metrics-certs\") pod \"openstack-operator-controller-manager-5d8f48999-r8rg8\" (UID: \"771782b8-3be4-499f-96b9-5b862de7f654\") " pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:30 crc kubenswrapper[4731]: I1203 19:09:30.786889 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:37 crc kubenswrapper[4731]: I1203 19:09:37.559848 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:09:40 crc kubenswrapper[4731]: I1203 19:09:40.451540 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" event={"ID":"e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3","Type":"ContainerStarted","Data":"0d5dc98a3e3f49b2a0960672ec00e7d1be57223a780ff1663ac7be0f7ec47762"} Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.467642 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.467872 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r42c6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-bqv9t_openstack-operators(7f50e5ed-f1dc-4706-8a7c-0d7e85a06351): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.469082 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t" podUID="7f50e5ed-f1dc-4706-8a7c-0d7e85a06351" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.474691 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.474825 4731 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v7xgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-rlbmk_openstack-operators(f6d78107-9c18-4cca-afa8-a360f45c6bac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.475984 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk" podUID="f6d78107-9c18-4cca-afa8-a360f45c6bac" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.504490 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.504711 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gkcrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
heat-operator-controller-manager-5f64f6f8bb-w54rz_openstack-operators(cf89e5f1-3460-42f6-b66f-6a556118cd30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.505942 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz" podUID="cf89e5f1-3460-42f6-b66f-6a556118cd30" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.508018 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.508223 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g2ljh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-rpprb_openstack-operators(b16dc4e0-2027-45dd-bee8-c5c5346e13f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.509966 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb" podUID="b16dc4e0-2027-45dd-bee8-c5c5346e13f5" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.537916 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.538108 4731 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-scglf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-rns58_openstack-operators(dd42edc8-cfba-4913-8d87-7860ecef904f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.539316 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58" podUID="dd42edc8-cfba-4913-8d87-7860ecef904f" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.569872 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.570081 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7l2t9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
mariadb-operator-controller-manager-56bbcc9d85-ncb9l_openstack-operators(3fe917e9-4872-4eb1-9bc4-744c81813123): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.571243 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l" podUID="3fe917e9-4872-4eb1-9bc4-744c81813123" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.605038 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.605243 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kcxsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-54m7x_openstack-operators(6085f8d0-d279-4997-a86a-3e539495c9d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.606782 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x" podUID="6085f8d0-d279-4997-a86a-3e539495c9d0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.612626 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/openstack-k8s-operators/infra-operator:b7b2753d06abfed166b5d4c6cd6c234324eaeed7" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.612698 4731 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/openstack-k8s-operators/infra-operator:b7b2753d06abfed166b5d4c6cd6c234324eaeed7" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.612989 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.94:5001/openstack-k8s-operators/infra-operator:b7b2753d06abfed166b5d4c6cd6c234324eaeed7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sbnj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-777cfc666b-wx49m_openstack-operators(8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.774793 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.775011 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9kflw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-ddmb5_openstack-operators(2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.776430 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" podUID="2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.810311 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.810465 4731 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x4pgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-85q9h_openstack-operators(37985ade-8410-4ecb-af0c-4d7bdd40608a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.811934 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" podUID="37985ade-8410-4ecb-af0c-4d7bdd40608a" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.877434 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.877630 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-966kw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
designate-operator-controller-manager-78b4bc895b-2vmhb_openstack-operators(d8e13eee-1041-4fb9-b8d1-6169c42d5de3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:40 crc kubenswrapper[4731]: E1203 19:09:40.878979 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb" podUID="d8e13eee-1041-4fb9-b8d1-6169c42d5de3" Dec 03 19:09:41 crc kubenswrapper[4731]: E1203 19:09:41.153724 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81" Dec 03 19:09:41 crc kubenswrapper[4731]: E1203 19:09:41.154245 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOME
TER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom
:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack
-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVa
r{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUT
E_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueF
rom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dkvsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq_openstack-operators(e625ea8d-55cc-4749-80f6-2848e064a6bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:09:41 crc kubenswrapper[4731]: E1203 19:09:41.209977 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 19:09:41 crc kubenswrapper[4731]: E1203 19:09:41.210157 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52n8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-d22cz_openstack-operators(1c371c8f-7ede-4286-9d7c-6f65f1323237): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 19:09:41 crc kubenswrapper[4731]: E1203 19:09:41.211493 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz" podUID="1c371c8f-7ede-4286-9d7c-6f65f1323237" Dec 03 19:09:41 crc kubenswrapper[4731]: I1203 19:09:41.478991 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" event={"ID":"23b6d2e9-27bc-4944-ba47-a592981fa0d3","Type":"ContainerStarted","Data":"6b76fe19e51bd286352526d10677d80930eff438355c6cfcc45498bfaee3eb56"} Dec 03 19:09:41 crc kubenswrapper[4731]: I1203 19:09:41.740595 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8"] Dec 03 19:09:41 crc kubenswrapper[4731]: W1203 19:09:41.764187 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod771782b8_3be4_499f_96b9_5b862de7f654.slice/crio-802d9b149ba0a4ea6aa48e63b7979534b2fa3618f744b589ee4b33c044182a1d WatchSource:0}: Error finding container 802d9b149ba0a4ea6aa48e63b7979534b2fa3618f744b589ee4b33c044182a1d: Status 404 returned error can't find the container with id 802d9b149ba0a4ea6aa48e63b7979534b2fa3618f744b589ee4b33c044182a1d Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.490727 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz" event={"ID":"1c371c8f-7ede-4286-9d7c-6f65f1323237","Type":"ContainerStarted","Data":"91e610d4312b40914d12a88338d5189d03b32e59c35e756b7c2c59edc71a3124"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.491520 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.493109 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" event={"ID":"2b4f79c3-66c0-4c91-bfd1-bef243806900","Type":"ContainerStarted","Data":"a944d36a965348291189e15c977faa4492dbc1b3b0bc82c7f1104f8bf3c583a7"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.493917 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.495705 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz" Dec 03 19:09:42 crc 
kubenswrapper[4731]: I1203 19:09:42.496575 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" event={"ID":"771782b8-3be4-499f-96b9-5b862de7f654","Type":"ContainerStarted","Data":"3a662e71b2ca2a8abc79b3d2d59e9eb305611f929476bea0a8965daf9a5b4c3d"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.496609 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" event={"ID":"771782b8-3be4-499f-96b9-5b862de7f654","Type":"ContainerStarted","Data":"802d9b149ba0a4ea6aa48e63b7979534b2fa3618f744b589ee4b33c044182a1d"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.497117 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.499289 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" event={"ID":"23b6d2e9-27bc-4944-ba47-a592981fa0d3","Type":"ContainerStarted","Data":"d01f41450b1bca8c19109afbd1a368c344f41715eb311155718777923426ba58"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.499899 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.503471 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb" event={"ID":"362b0743-3165-454e-93b9-b6713d26680b","Type":"ContainerStarted","Data":"c2beb2db999bb23aa78cc29711bfeba66de0ac2e5300860ba3257e4625e478eb"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.503637 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.504535 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.505847 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.505912 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk" event={"ID":"c714457d-536a-48fe-8df4-758cff8fb22d","Type":"ContainerStarted","Data":"5774b8e4f5b4f67db179dba76cff8db273588a756e1e481bb335445ba29e577b"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.506549 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.509140 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.509880 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" event={"ID":"36852547-6431-45bc-b56e-6f8261334da2","Type":"ContainerStarted","Data":"d9fae07e54833cbe616a18620973d0f1e6cb24fcc1d12f8367700efe0f5335bb"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.509933 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" event={"ID":"36852547-6431-45bc-b56e-6f8261334da2","Type":"ContainerStarted","Data":"20fac171f45dde8cb4f10ce2fba660c14817be7c7b570004f155620d499bcfd8"} Dec 03 19:09:42 crc 
kubenswrapper[4731]: I1203 19:09:42.510171 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.512184 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" event={"ID":"699771e7-b57b-4435-95d9-964c90bbcc3f","Type":"ContainerStarted","Data":"73b21df2d1e0a26585484a19589774f31fdd4ef1f51f3c46b5a5c7141dca1155"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.512404 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.514883 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.514966 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" event={"ID":"2efe3c0e-8643-45f4-920e-17aa65157644","Type":"ContainerStarted","Data":"82bc8a05b4025a5be6d5f8e144377c5f826c51bdad7203f89ac7f81e4468d109"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.515000 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" event={"ID":"2efe3c0e-8643-45f4-920e-17aa65157644","Type":"ContainerStarted","Data":"ffa20b868dedced9cb9921ab14bbe4f12bb43106e8a088b6bb7c4c5a55052a3c"} Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.515209 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.531786 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d22cz" podStartSLOduration=22.26539539 podStartE2EDuration="45.531761502s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.792603276 +0000 UTC m=+860.391197740" lastFinishedPulling="2025-12-03 19:09:23.058969388 +0000 UTC m=+883.657563852" observedRunningTime="2025-12-03 19:09:42.531289087 +0000 UTC m=+903.129883571" watchObservedRunningTime="2025-12-03 19:09:42.531761502 +0000 UTC m=+903.130355976" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.561447 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" podStartSLOduration=18.149326498 podStartE2EDuration="45.56142398s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:09:00.211512235 +0000 UTC m=+860.810106699" lastFinishedPulling="2025-12-03 19:09:27.623609717 +0000 UTC m=+888.222204181" observedRunningTime="2025-12-03 19:09:42.557228137 +0000 UTC m=+903.155822611" watchObservedRunningTime="2025-12-03 19:09:42.56142398 +0000 UTC m=+903.160018454" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.610351 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" podStartSLOduration=44.610324655 podStartE2EDuration="44.610324655s" podCreationTimestamp="2025-12-03 19:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:09:42.604187441 +0000 UTC m=+903.202781925" watchObservedRunningTime="2025-12-03 19:09:42.610324655 +0000 UTC m=+903.208919119" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.639065 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" 
podStartSLOduration=17.096543799 podStartE2EDuration="44.639027492s" podCreationTimestamp="2025-12-03 19:08:58 +0000 UTC" firstStartedPulling="2025-12-03 19:09:00.210504653 +0000 UTC m=+860.809099107" lastFinishedPulling="2025-12-03 19:09:27.752988336 +0000 UTC m=+888.351582800" observedRunningTime="2025-12-03 19:09:42.636457761 +0000 UTC m=+903.235052225" watchObservedRunningTime="2025-12-03 19:09:42.639027492 +0000 UTC m=+903.237621956" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.712800 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-h2ldk" podStartSLOduration=4.317541263 podStartE2EDuration="45.712772174s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.889180608 +0000 UTC m=+860.487775072" lastFinishedPulling="2025-12-03 19:09:41.284411509 +0000 UTC m=+901.883005983" observedRunningTime="2025-12-03 19:09:42.671998874 +0000 UTC m=+903.270593338" watchObservedRunningTime="2025-12-03 19:09:42.712772174 +0000 UTC m=+903.311366638" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.718924 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-428v7" podStartSLOduration=3.626443314 podStartE2EDuration="44.718898316s" podCreationTimestamp="2025-12-03 19:08:58 +0000 UTC" firstStartedPulling="2025-12-03 19:09:00.210762851 +0000 UTC m=+860.809357315" lastFinishedPulling="2025-12-03 19:09:41.303217853 +0000 UTC m=+901.901812317" observedRunningTime="2025-12-03 19:09:42.718566947 +0000 UTC m=+903.317161421" watchObservedRunningTime="2025-12-03 19:09:42.718898316 +0000 UTC m=+903.317492780" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.789903 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" podStartSLOduration=17.49965097 
podStartE2EDuration="44.789686594s" podCreationTimestamp="2025-12-03 19:08:58 +0000 UTC" firstStartedPulling="2025-12-03 19:09:00.332386016 +0000 UTC m=+860.930980480" lastFinishedPulling="2025-12-03 19:09:27.62242164 +0000 UTC m=+888.221016104" observedRunningTime="2025-12-03 19:09:42.76709432 +0000 UTC m=+903.365688794" watchObservedRunningTime="2025-12-03 19:09:42.789686594 +0000 UTC m=+903.388281058" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.825791 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9b2cf" podStartSLOduration=3.7274864279999997 podStartE2EDuration="44.825765444s" podCreationTimestamp="2025-12-03 19:08:58 +0000 UTC" firstStartedPulling="2025-12-03 19:09:00.208137238 +0000 UTC m=+860.806731712" lastFinishedPulling="2025-12-03 19:09:41.306416264 +0000 UTC m=+901.905010728" observedRunningTime="2025-12-03 19:09:42.799606378 +0000 UTC m=+903.398200842" watchObservedRunningTime="2025-12-03 19:09:42.825765444 +0000 UTC m=+903.424359908" Dec 03 19:09:42 crc kubenswrapper[4731]: I1203 19:09:42.828338 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2zlnb" podStartSLOduration=3.411123043 podStartE2EDuration="45.828333065s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:58.850446337 +0000 UTC m=+859.449040801" lastFinishedPulling="2025-12-03 19:09:41.267656359 +0000 UTC m=+901.866250823" observedRunningTime="2025-12-03 19:09:42.823134261 +0000 UTC m=+903.421728725" watchObservedRunningTime="2025-12-03 19:09:42.828333065 +0000 UTC m=+903.426927529" Dec 03 19:09:43 crc kubenswrapper[4731]: I1203 19:09:43.532295 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz" 
event={"ID":"cf89e5f1-3460-42f6-b66f-6a556118cd30","Type":"ContainerStarted","Data":"4e1bf9ed58fd8ea662cef5c01b998eef47232ecdb73245c7eb7fcceb27db5476"} Dec 03 19:09:43 crc kubenswrapper[4731]: I1203 19:09:43.538092 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb" event={"ID":"d8e13eee-1041-4fb9-b8d1-6169c42d5de3","Type":"ContainerStarted","Data":"65047ca27c7f51c2e57a84131e2b2193db350e6df45c6f6f5709c4b7e17d0f5b"} Dec 03 19:09:43 crc kubenswrapper[4731]: I1203 19:09:43.564096 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" event={"ID":"37985ade-8410-4ecb-af0c-4d7bdd40608a","Type":"ContainerStarted","Data":"4923220536dd438249c0047b7d276ae5a70361eb2974a78910ec222fc95e20ea"} Dec 03 19:09:43 crc kubenswrapper[4731]: I1203 19:09:43.577495 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t" event={"ID":"7f50e5ed-f1dc-4706-8a7c-0d7e85a06351","Type":"ContainerStarted","Data":"9f8362444dd370231fceb3719e96487ad4203055410b9d3ae03441a804a6dbae"} Dec 03 19:09:43 crc kubenswrapper[4731]: I1203 19:09:43.582065 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" event={"ID":"e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3","Type":"ContainerStarted","Data":"f24a1603963725a7576377b35e9b58c3523ea5b557b112b44a9b764b8f80a8cb"} Dec 03 19:09:43 crc kubenswrapper[4731]: I1203 19:09:43.583244 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" Dec 03 19:09:43 crc kubenswrapper[4731]: I1203 19:09:43.595997 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsx4k" 
event={"ID":"bba6fc51-4a48-407c-a78f-fba285b49684","Type":"ContainerStarted","Data":"778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f"} Dec 03 19:09:43 crc kubenswrapper[4731]: I1203 19:09:43.625479 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" podStartSLOduration=3.259826138 podStartE2EDuration="45.625447529s" podCreationTimestamp="2025-12-03 19:08:58 +0000 UTC" firstStartedPulling="2025-12-03 19:09:00.306401794 +0000 UTC m=+860.904996258" lastFinishedPulling="2025-12-03 19:09:42.672023185 +0000 UTC m=+903.270617649" observedRunningTime="2025-12-03 19:09:43.610896839 +0000 UTC m=+904.209491313" watchObservedRunningTime="2025-12-03 19:09:43.625447529 +0000 UTC m=+904.224042013" Dec 03 19:09:44 crc kubenswrapper[4731]: E1203 19:09:44.046011 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" podUID="8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5" Dec 03 19:09:44 crc kubenswrapper[4731]: E1203 19:09:44.566982 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" podUID="e625ea8d-55cc-4749-80f6-2848e064a6bf" Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 19:09:44.723220 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l" event={"ID":"3fe917e9-4872-4eb1-9bc4-744c81813123","Type":"ContainerStarted","Data":"056315fd54c4d237914834e21a2a4117bea84d8a07af177c2527a66fe5f472b2"} Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 
19:09:44.724570 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" event={"ID":"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5","Type":"ContainerStarted","Data":"5eb3a239031463fc7dd506bc574e956a95c81b18dc5422f3203420559ef3fdcc"} Dec 03 19:09:44 crc kubenswrapper[4731]: E1203 19:09:44.747446 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.94:5001/openstack-k8s-operators/infra-operator:b7b2753d06abfed166b5d4c6cd6c234324eaeed7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" podUID="8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5" Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 19:09:44.809432 4731 generic.go:334] "Generic (PLEG): container finished" podID="bba6fc51-4a48-407c-a78f-fba285b49684" containerID="778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f" exitCode=0 Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 19:09:44.809503 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsx4k" event={"ID":"bba6fc51-4a48-407c-a78f-fba285b49684","Type":"ContainerDied","Data":"778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f"} Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 19:09:44.839676 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x" event={"ID":"6085f8d0-d279-4997-a86a-3e539495c9d0","Type":"ContainerStarted","Data":"97142fbe19d932489d790e5e4d29840fd920dd7f7721d3babcc73636f7a758ce"} Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 19:09:44.961520 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" 
event={"ID":"2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f","Type":"ContainerStarted","Data":"f1760c150348f2fe47933455bd406ced2a0d3aafa9ffaa030b36dae42932e6b1"} Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 19:09:44.969959 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" event={"ID":"e625ea8d-55cc-4749-80f6-2848e064a6bf","Type":"ContainerStarted","Data":"4b36c732572115c8f92e133f66c6957c198b901aa22e5170bf263246a3e47136"} Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 19:09:44.974932 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk" event={"ID":"f6d78107-9c18-4cca-afa8-a360f45c6bac","Type":"ContainerStarted","Data":"30144a23e36de8312555393f7591c00378fdf339736198fa9004bfdaf0e8e458"} Dec 03 19:09:44 crc kubenswrapper[4731]: E1203 19:09:44.975725 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" podUID="e625ea8d-55cc-4749-80f6-2848e064a6bf" Dec 03 19:09:44 crc kubenswrapper[4731]: I1203 19:09:44.980541 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" event={"ID":"37985ade-8410-4ecb-af0c-4d7bdd40608a","Type":"ContainerStarted","Data":"a06c5473b887c8504d2a0e9fbf5cb6d43385c3192c8ae217d90464ef4cdf2897"} Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.049872 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.066120 4731 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" event={"ID":"ce8b3502-d3bc-463d-b3a4-a758b6b42acb","Type":"ContainerStarted","Data":"c5f462e408302764483decb0878442a1e47e52a8763f3426f0c5ae536d831042"} Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.075924 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t" event={"ID":"7f50e5ed-f1dc-4706-8a7c-0d7e85a06351","Type":"ContainerStarted","Data":"03535a823382ea2207e185ae924dc9b8556c8751e51f1f72636f8561d3b81587"} Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.076050 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t" Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.079003 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb" event={"ID":"b16dc4e0-2027-45dd-bee8-c5c5346e13f5","Type":"ContainerStarted","Data":"c11ca5186a16ac121a54c2e89855fcb68be6528cb3a7fdedac00f5e1712f0734"} Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.079355 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb" Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.096322 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-qsgsq" Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.139625 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" podStartSLOduration=5.43467072 podStartE2EDuration="48.139606055s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.957481807 +0000 UTC 
m=+860.556076271" lastFinishedPulling="2025-12-03 19:09:42.662417142 +0000 UTC m=+903.261011606" observedRunningTime="2025-12-03 19:09:45.139372418 +0000 UTC m=+905.737966882" watchObservedRunningTime="2025-12-03 19:09:45.139606055 +0000 UTC m=+905.738200519" Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.172033 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t" podStartSLOduration=5.506959704 podStartE2EDuration="48.171996849s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.871397015 +0000 UTC m=+860.469991479" lastFinishedPulling="2025-12-03 19:09:42.53643416 +0000 UTC m=+903.135028624" observedRunningTime="2025-12-03 19:09:45.168603111 +0000 UTC m=+905.767197575" watchObservedRunningTime="2025-12-03 19:09:45.171996849 +0000 UTC m=+905.770591313" Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.202617 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gwz5z" podStartSLOduration=4.422284379 podStartE2EDuration="47.202596636s" podCreationTimestamp="2025-12-03 19:08:58 +0000 UTC" firstStartedPulling="2025-12-03 19:09:00.172446561 +0000 UTC m=+860.771041025" lastFinishedPulling="2025-12-03 19:09:42.952758818 +0000 UTC m=+903.551353282" observedRunningTime="2025-12-03 19:09:45.199305921 +0000 UTC m=+905.797900385" watchObservedRunningTime="2025-12-03 19:09:45.202596636 +0000 UTC m=+905.801191100" Dec 03 19:09:45 crc kubenswrapper[4731]: I1203 19:09:45.451454 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb" podStartSLOduration=5.298107472 podStartE2EDuration="48.45143478s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.915141028 +0000 UTC m=+860.513735482" 
lastFinishedPulling="2025-12-03 19:09:43.068468326 +0000 UTC m=+903.667062790" observedRunningTime="2025-12-03 19:09:45.230727175 +0000 UTC m=+905.829321639" watchObservedRunningTime="2025-12-03 19:09:45.45143478 +0000 UTC m=+906.050029244" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.148027 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsx4k" event={"ID":"bba6fc51-4a48-407c-a78f-fba285b49684","Type":"ContainerStarted","Data":"91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.150500 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz" event={"ID":"cf89e5f1-3460-42f6-b66f-6a556118cd30","Type":"ContainerStarted","Data":"6e12399377d74a976d79dfc799951dc879d2bdbd1e802c8fe93bda3a19ba9846"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.150639 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.153062 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" event={"ID":"2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f","Type":"ContainerStarted","Data":"d203bddb2a863558edf1482d305a746d62d6fd921acb4b528e766159409bc80d"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.153405 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.155472 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk" 
event={"ID":"f6d78107-9c18-4cca-afa8-a360f45c6bac","Type":"ContainerStarted","Data":"c99c73ba190e610c78af70c423aecbe8c7c7bbfa70ca576543ec2748be6f5202"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.155613 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.157247 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb" event={"ID":"d8e13eee-1041-4fb9-b8d1-6169c42d5de3","Type":"ContainerStarted","Data":"53d43ad1bb16dcec5dc0461a8282cfbcdf4584665472d1f0f05941feec18c3af"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.157407 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.159503 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l" event={"ID":"3fe917e9-4872-4eb1-9bc4-744c81813123","Type":"ContainerStarted","Data":"49f748365756db6f84af959f3130643aabb4c59b5d601f11bc57cb2904353665"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.159672 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.161663 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58" event={"ID":"dd42edc8-cfba-4913-8d87-7860ecef904f","Type":"ContainerStarted","Data":"1a164ab3ef160c0d61d9693f359fee150a04078e5d44261d94e5795f4f960444"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.161687 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58" event={"ID":"dd42edc8-cfba-4913-8d87-7860ecef904f","Type":"ContainerStarted","Data":"fc51739f83b731c73dff20484ae2ad9f8f16b310b7d6909ecdeaf272db107450"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.161910 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.164527 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb" event={"ID":"b16dc4e0-2027-45dd-bee8-c5c5346e13f5","Type":"ContainerStarted","Data":"99142328377fb6cd4368d0f1cd24830abc3e8101ded3e732a8b4397c93c1803f"} Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.166872 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x" event={"ID":"6085f8d0-d279-4997-a86a-3e539495c9d0","Type":"ContainerStarted","Data":"c2c9654df6cfa1844bc9572d09c2aab0f323c0268cd2ace63d54f13ed8a5cb48"} Dec 03 19:09:46 crc kubenswrapper[4731]: E1203 19:09:46.169246 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" podUID="e625ea8d-55cc-4749-80f6-2848e064a6bf" Dec 03 19:09:46 crc kubenswrapper[4731]: E1203 19:09:46.169369 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.94:5001/openstack-k8s-operators/infra-operator:b7b2753d06abfed166b5d4c6cd6c234324eaeed7\\\"\"" 
pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" podUID="8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.215020 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fsx4k" podStartSLOduration=35.250620594 podStartE2EDuration="43.214991234s" podCreationTimestamp="2025-12-03 19:09:03 +0000 UTC" firstStartedPulling="2025-12-03 19:09:37.559460398 +0000 UTC m=+898.158054862" lastFinishedPulling="2025-12-03 19:09:45.523831038 +0000 UTC m=+906.122425502" observedRunningTime="2025-12-03 19:09:46.187440062 +0000 UTC m=+906.786034526" watchObservedRunningTime="2025-12-03 19:09:46.214991234 +0000 UTC m=+906.813585698" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.217869 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58" podStartSLOduration=5.771809195 podStartE2EDuration="49.217857504s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.880659728 +0000 UTC m=+860.479254192" lastFinishedPulling="2025-12-03 19:09:43.326708037 +0000 UTC m=+903.925302501" observedRunningTime="2025-12-03 19:09:46.209767168 +0000 UTC m=+906.808361632" watchObservedRunningTime="2025-12-03 19:09:46.217857504 +0000 UTC m=+906.816451968" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.626674 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" podStartSLOduration=6.019472353 podStartE2EDuration="49.626647704s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.957382243 +0000 UTC m=+860.555976707" lastFinishedPulling="2025-12-03 19:09:43.564557594 +0000 UTC m=+904.163152058" observedRunningTime="2025-12-03 19:09:46.588634422 +0000 UTC m=+907.187228906" 
watchObservedRunningTime="2025-12-03 19:09:46.626647704 +0000 UTC m=+907.225242178" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.631469 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x" podStartSLOduration=6.170341041 podStartE2EDuration="49.631443766s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.880872395 +0000 UTC m=+860.479466859" lastFinishedPulling="2025-12-03 19:09:43.34197512 +0000 UTC m=+903.940569584" observedRunningTime="2025-12-03 19:09:46.311325788 +0000 UTC m=+906.909920252" watchObservedRunningTime="2025-12-03 19:09:46.631443766 +0000 UTC m=+907.230038230" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.755204 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk" podStartSLOduration=5.858471145 podStartE2EDuration="49.755185057s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.201453052 +0000 UTC m=+859.800047506" lastFinishedPulling="2025-12-03 19:09:43.098166954 +0000 UTC m=+903.696761418" observedRunningTime="2025-12-03 19:09:46.749120785 +0000 UTC m=+907.347715249" watchObservedRunningTime="2025-12-03 19:09:46.755185057 +0000 UTC m=+907.353779521" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.782771 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l" podStartSLOduration=6.434665865 podStartE2EDuration="49.782746428s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.969245538 +0000 UTC m=+860.567840002" lastFinishedPulling="2025-12-03 19:09:43.317326101 +0000 UTC m=+903.915920565" observedRunningTime="2025-12-03 19:09:46.781646353 +0000 UTC m=+907.380240817" 
watchObservedRunningTime="2025-12-03 19:09:46.782746428 +0000 UTC m=+907.381340922" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.802944 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz" podStartSLOduration=7.039828213 podStartE2EDuration="49.802923626s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.880359129 +0000 UTC m=+860.478953593" lastFinishedPulling="2025-12-03 19:09:42.643454542 +0000 UTC m=+903.242049006" observedRunningTime="2025-12-03 19:09:46.800694785 +0000 UTC m=+907.399289249" watchObservedRunningTime="2025-12-03 19:09:46.802923626 +0000 UTC m=+907.401518090" Dec 03 19:09:46 crc kubenswrapper[4731]: I1203 19:09:46.968597 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb" podStartSLOduration=6.813107676 podStartE2EDuration="49.968573231s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:08:59.47131463 +0000 UTC m=+860.069909094" lastFinishedPulling="2025-12-03 19:09:42.626780195 +0000 UTC m=+903.225374649" observedRunningTime="2025-12-03 19:09:46.96474492 +0000 UTC m=+907.563339394" watchObservedRunningTime="2025-12-03 19:09:46.968573231 +0000 UTC m=+907.567167695" Dec 03 19:09:47 crc kubenswrapper[4731]: I1203 19:09:47.175313 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x" Dec 03 19:09:48 crc kubenswrapper[4731]: I1203 19:09:48.110775 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-2vmhb" Dec 03 19:09:48 crc kubenswrapper[4731]: I1203 19:09:48.174952 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-bqv9t" Dec 03 19:09:48 crc kubenswrapper[4731]: I1203 19:09:48.190896 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-w54rz" Dec 03 19:09:48 crc kubenswrapper[4731]: I1203 19:09:48.305038 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rpprb" Dec 03 19:09:48 crc kubenswrapper[4731]: I1203 19:09:48.435995 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ncb9l" Dec 03 19:09:48 crc kubenswrapper[4731]: I1203 19:09:48.570728 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-85q9h" Dec 03 19:09:48 crc kubenswrapper[4731]: I1203 19:09:48.953546 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-gt4xb" Dec 03 19:09:49 crc kubenswrapper[4731]: I1203 19:09:49.007106 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-z74rk" Dec 03 19:09:49 crc kubenswrapper[4731]: I1203 19:09:49.120434 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zlg7s" Dec 03 19:09:50 crc kubenswrapper[4731]: I1203 19:09:50.793063 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5d8f48999-r8rg8" Dec 03 19:09:53 crc kubenswrapper[4731]: I1203 19:09:53.361133 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:53 crc 
kubenswrapper[4731]: I1203 19:09:53.361463 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:53 crc kubenswrapper[4731]: I1203 19:09:53.418375 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:54 crc kubenswrapper[4731]: I1203 19:09:54.267146 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:54 crc kubenswrapper[4731]: I1203 19:09:54.371274 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsx4k"] Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.238025 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fsx4k" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" containerName="registry-server" containerID="cri-o://91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d" gracePeriod=2 Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.469090 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.469748 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.655704 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.835415 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpwrz\" (UniqueName: \"kubernetes.io/projected/bba6fc51-4a48-407c-a78f-fba285b49684-kube-api-access-wpwrz\") pod \"bba6fc51-4a48-407c-a78f-fba285b49684\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.835471 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-utilities\") pod \"bba6fc51-4a48-407c-a78f-fba285b49684\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.835535 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-catalog-content\") pod \"bba6fc51-4a48-407c-a78f-fba285b49684\" (UID: \"bba6fc51-4a48-407c-a78f-fba285b49684\") " Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.836647 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-utilities" (OuterVolumeSpecName: "utilities") pod "bba6fc51-4a48-407c-a78f-fba285b49684" (UID: "bba6fc51-4a48-407c-a78f-fba285b49684"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.844464 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba6fc51-4a48-407c-a78f-fba285b49684-kube-api-access-wpwrz" (OuterVolumeSpecName: "kube-api-access-wpwrz") pod "bba6fc51-4a48-407c-a78f-fba285b49684" (UID: "bba6fc51-4a48-407c-a78f-fba285b49684"). InnerVolumeSpecName "kube-api-access-wpwrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.854033 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bba6fc51-4a48-407c-a78f-fba285b49684" (UID: "bba6fc51-4a48-407c-a78f-fba285b49684"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.936964 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpwrz\" (UniqueName: \"kubernetes.io/projected/bba6fc51-4a48-407c-a78f-fba285b49684-kube-api-access-wpwrz\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.936999 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:56 crc kubenswrapper[4731]: I1203 19:09:56.937009 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bba6fc51-4a48-407c-a78f-fba285b49684-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.248067 4731 generic.go:334] "Generic (PLEG): container finished" podID="bba6fc51-4a48-407c-a78f-fba285b49684" containerID="91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d" exitCode=0 Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.248160 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsx4k" event={"ID":"bba6fc51-4a48-407c-a78f-fba285b49684","Type":"ContainerDied","Data":"91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d"} Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.248214 4731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fsx4k" event={"ID":"bba6fc51-4a48-407c-a78f-fba285b49684","Type":"ContainerDied","Data":"76c74a8caf8a7e5b40476223e6cbabed5c66987a47a2911fba0dee770e33a302"} Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.248241 4731 scope.go:117] "RemoveContainer" containerID="91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.248292 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsx4k" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.271219 4731 scope.go:117] "RemoveContainer" containerID="778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.314487 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsx4k"] Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.321278 4731 scope.go:117] "RemoveContainer" containerID="f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.330288 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsx4k"] Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.343517 4731 scope.go:117] "RemoveContainer" containerID="91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d" Dec 03 19:09:57 crc kubenswrapper[4731]: E1203 19:09:57.348688 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d\": container with ID starting with 91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d not found: ID does not exist" containerID="91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.348734 4731 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d"} err="failed to get container status \"91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d\": rpc error: code = NotFound desc = could not find container \"91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d\": container with ID starting with 91345ab325d91d5fb4bb0bef9f213d41a67dd5c25dfe7002b9bf63b3e37f593d not found: ID does not exist" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.348767 4731 scope.go:117] "RemoveContainer" containerID="778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f" Dec 03 19:09:57 crc kubenswrapper[4731]: E1203 19:09:57.349100 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f\": container with ID starting with 778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f not found: ID does not exist" containerID="778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.349129 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f"} err="failed to get container status \"778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f\": rpc error: code = NotFound desc = could not find container \"778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f\": container with ID starting with 778a58cb3797b2e7566498f03c7d5b9aa82ce229c8aac64776d432f9acf3988f not found: ID does not exist" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.349146 4731 scope.go:117] "RemoveContainer" containerID="f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4" Dec 03 19:09:57 crc kubenswrapper[4731]: E1203 
19:09:57.349376 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4\": container with ID starting with f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4 not found: ID does not exist" containerID="f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.349402 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4"} err="failed to get container status \"f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4\": rpc error: code = NotFound desc = could not find container \"f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4\": container with ID starting with f3f206c28ff6720f5fdd2ae03a7818668cf9cf915220d24a702f0ff0d2dfbfd4 not found: ID does not exist" Dec 03 19:09:57 crc kubenswrapper[4731]: I1203 19:09:57.868857 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" path="/var/lib/kubelet/pods/bba6fc51-4a48-407c-a78f-fba285b49684/volumes" Dec 03 19:09:58 crc kubenswrapper[4731]: I1203 19:09:58.064926 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rlbmk" Dec 03 19:09:58 crc kubenswrapper[4731]: I1203 19:09:58.316800 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-rns58" Dec 03 19:09:58 crc kubenswrapper[4731]: I1203 19:09:58.404714 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-54m7x" Dec 03 19:09:58 crc kubenswrapper[4731]: I1203 19:09:58.619484 4731 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-ddmb5" Dec 03 19:10:02 crc kubenswrapper[4731]: I1203 19:10:02.294950 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" event={"ID":"e625ea8d-55cc-4749-80f6-2848e064a6bf","Type":"ContainerStarted","Data":"970d9bd41b375d9641bfe70332c9ef014c640e2fa23b6fa6610f15add56ac7e0"} Dec 03 19:10:02 crc kubenswrapper[4731]: I1203 19:10:02.296663 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:10:02 crc kubenswrapper[4731]: I1203 19:10:02.298389 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" event={"ID":"8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5","Type":"ContainerStarted","Data":"7b8f45e465c229f2697bda41e7e426c3b8f2556675505aaa19b4ed46a42185f6"} Dec 03 19:10:02 crc kubenswrapper[4731]: I1203 19:10:02.298852 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 19:10:02 crc kubenswrapper[4731]: I1203 19:10:02.321465 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" podStartSLOduration=31.248107395 podStartE2EDuration="1m5.321442431s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:09:27.615198361 +0000 UTC m=+888.213792825" lastFinishedPulling="2025-12-03 19:10:01.688533397 +0000 UTC m=+922.287127861" observedRunningTime="2025-12-03 19:10:02.32043909 +0000 UTC m=+922.919033574" watchObservedRunningTime="2025-12-03 19:10:02.321442431 +0000 UTC m=+922.920036905" Dec 03 19:10:02 crc kubenswrapper[4731]: I1203 19:10:02.350925 
4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" podStartSLOduration=30.516918774 podStartE2EDuration="1m5.350903132s" podCreationTimestamp="2025-12-03 19:08:57 +0000 UTC" firstStartedPulling="2025-12-03 19:09:27.083411833 +0000 UTC m=+887.682006297" lastFinishedPulling="2025-12-03 19:10:01.917396191 +0000 UTC m=+922.515990655" observedRunningTime="2025-12-03 19:10:02.347839045 +0000 UTC m=+922.946433509" watchObservedRunningTime="2025-12-03 19:10:02.350903132 +0000 UTC m=+922.949497616" Dec 03 19:10:14 crc kubenswrapper[4731]: I1203 19:10:14.148234 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-777cfc666b-wx49m" Dec 03 19:10:14 crc kubenswrapper[4731]: I1203 19:10:14.542078 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq" Dec 03 19:10:26 crc kubenswrapper[4731]: I1203 19:10:26.469105 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:10:26 crc kubenswrapper[4731]: I1203 19:10:26.470751 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:10:26 crc kubenswrapper[4731]: I1203 19:10:26.470909 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 
19:10:26 crc kubenswrapper[4731]: I1203 19:10:26.471608 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af854c9d3b32f450920436e43467df2efcb54fae683ee960492021d62f140295"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:10:26 crc kubenswrapper[4731]: I1203 19:10:26.471745 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://af854c9d3b32f450920436e43467df2efcb54fae683ee960492021d62f140295" gracePeriod=600 Dec 03 19:10:27 crc kubenswrapper[4731]: I1203 19:10:27.581812 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="af854c9d3b32f450920436e43467df2efcb54fae683ee960492021d62f140295" exitCode=0 Dec 03 19:10:27 crc kubenswrapper[4731]: I1203 19:10:27.581894 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"af854c9d3b32f450920436e43467df2efcb54fae683ee960492021d62f140295"} Dec 03 19:10:27 crc kubenswrapper[4731]: I1203 19:10:27.582311 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"4e59b557ae762b84b60c06ac0b9fadc27bee96a8b13a95e3f34bd03098de4d47"} Dec 03 19:10:27 crc kubenswrapper[4731]: I1203 19:10:27.582361 4731 scope.go:117] "RemoveContainer" containerID="800db96b32fec13e6990bc15d820baddf81db70da12482cb022dfa84b57e785b" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.892879 4731 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nd6m6"] Dec 03 19:10:28 crc kubenswrapper[4731]: E1203 19:10:28.894674 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="extract-utilities" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.894691 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="extract-utilities" Dec 03 19:10:28 crc kubenswrapper[4731]: E1203 19:10:28.894706 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" containerName="extract-utilities" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.894716 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" containerName="extract-utilities" Dec 03 19:10:28 crc kubenswrapper[4731]: E1203 19:10:28.894755 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" containerName="registry-server" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.894765 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" containerName="registry-server" Dec 03 19:10:28 crc kubenswrapper[4731]: E1203 19:10:28.894792 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="extract-content" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.894799 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="extract-content" Dec 03 19:10:28 crc kubenswrapper[4731]: E1203 19:10:28.894830 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="registry-server" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.894846 4731 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="registry-server" Dec 03 19:10:28 crc kubenswrapper[4731]: E1203 19:10:28.894866 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" containerName="extract-content" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.894873 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" containerName="extract-content" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.895759 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba6fc51-4a48-407c-a78f-fba285b49684" containerName="registry-server" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.895797 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="564cdd3d-bea0-4d91-b1d3-b483b550209d" containerName="registry-server" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.905433 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.910177 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.910331 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.911795 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.910454 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ff8r4" Dec 03 19:10:28 crc kubenswrapper[4731]: I1203 19:10:28.930495 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nd6m6"] Dec 03 19:10:29 crc kubenswrapper[4731]: I1203 19:10:29.038838 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3bbd9e-0b1e-4199-b050-07d7d110e502-config\") pod \"dnsmasq-dns-675f4bcbfc-nd6m6\" (UID: \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:10:29 crc kubenswrapper[4731]: I1203 19:10:29.038985 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5fb8\" (UniqueName: \"kubernetes.io/projected/9f3bbd9e-0b1e-4199-b050-07d7d110e502-kube-api-access-n5fb8\") pod \"dnsmasq-dns-675f4bcbfc-nd6m6\" (UID: \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:10:29 crc kubenswrapper[4731]: I1203 19:10:29.140415 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3bbd9e-0b1e-4199-b050-07d7d110e502-config\") pod \"dnsmasq-dns-675f4bcbfc-nd6m6\" (UID: 
\"9f3bbd9e-0b1e-4199-b050-07d7d110e502\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:10:29 crc kubenswrapper[4731]: I1203 19:10:29.140478 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5fb8\" (UniqueName: \"kubernetes.io/projected/9f3bbd9e-0b1e-4199-b050-07d7d110e502-kube-api-access-n5fb8\") pod \"dnsmasq-dns-675f4bcbfc-nd6m6\" (UID: \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:10:29 crc kubenswrapper[4731]: I1203 19:10:29.141787 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3bbd9e-0b1e-4199-b050-07d7d110e502-config\") pod \"dnsmasq-dns-675f4bcbfc-nd6m6\" (UID: \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:10:29 crc kubenswrapper[4731]: I1203 19:10:29.170765 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5fb8\" (UniqueName: \"kubernetes.io/projected/9f3bbd9e-0b1e-4199-b050-07d7d110e502-kube-api-access-n5fb8\") pod \"dnsmasq-dns-675f4bcbfc-nd6m6\" (UID: \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:10:29 crc kubenswrapper[4731]: I1203 19:10:29.233751 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:10:29 crc kubenswrapper[4731]: I1203 19:10:29.737043 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nd6m6"] Dec 03 19:10:29 crc kubenswrapper[4731]: W1203 19:10:29.746129 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f3bbd9e_0b1e_4199_b050_07d7d110e502.slice/crio-ed5dca749417ad1eadc3873978388db8513ea564a687851f1b555a2bf3559295 WatchSource:0}: Error finding container ed5dca749417ad1eadc3873978388db8513ea564a687851f1b555a2bf3559295: Status 404 returned error can't find the container with id ed5dca749417ad1eadc3873978388db8513ea564a687851f1b555a2bf3559295 Dec 03 19:10:30 crc kubenswrapper[4731]: I1203 19:10:30.612998 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" event={"ID":"9f3bbd9e-0b1e-4199-b050-07d7d110e502","Type":"ContainerStarted","Data":"ed5dca749417ad1eadc3873978388db8513ea564a687851f1b555a2bf3559295"} Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.791143 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.793068 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.796663 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2rjh6" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.796973 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.797270 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.797440 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.797972 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.798105 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.798374 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.820566 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851082 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851155 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851201 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851217 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851235 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851270 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851296 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851314 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fm4g\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-kube-api-access-5fm4g\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851332 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851362 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.851379 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952489 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " 
pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952575 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952604 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952632 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fm4g\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-kube-api-access-5fm4g\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952661 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952731 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952759 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952802 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952868 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952927 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.952956 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.953604 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.954591 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.954721 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.954978 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.955383 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.981178 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.982973 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.983603 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.985768 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.986616 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fm4g\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-kube-api-access-5fm4g\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:33 crc kubenswrapper[4731]: I1203 19:10:33.989764 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.000840 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " pod="openstack/rabbitmq-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.088075 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.093424 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.098144 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.098270 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.098329 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.098437 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.098649 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.098825 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5qvkc" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.099003 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.102871 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.140738 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258068 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258164 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258192 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258228 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258248 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzj2\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-kube-api-access-pdzj2\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258297 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258317 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258332 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258369 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/469480bc-e167-4ecc-87c4-9691057d999f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258402 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.258453 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/469480bc-e167-4ecc-87c4-9691057d999f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360095 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/469480bc-e167-4ecc-87c4-9691057d999f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360158 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360200 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360231 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 
19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360300 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360320 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzj2\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-kube-api-access-pdzj2\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360344 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360359 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360374 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360391 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/469480bc-e167-4ecc-87c4-9691057d999f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360420 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.360714 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.362962 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.363999 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.364037 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.364435 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.364735 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.365887 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/469480bc-e167-4ecc-87c4-9691057d999f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.367157 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/469480bc-e167-4ecc-87c4-9691057d999f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.370794 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.374969 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.377042 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzj2\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-kube-api-access-pdzj2\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.378874 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.454603 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.931599 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.933154 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.936763 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.936840 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wrj5r" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.937231 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.937372 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.938403 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 19:10:34 crc kubenswrapper[4731]: I1203 19:10:34.945170 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.053835 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-kolla-config\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.053883 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d74ba4a-c904-441f-871b-57c691c528e2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.053913 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.054056 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znt8k\" (UniqueName: \"kubernetes.io/projected/2d74ba4a-c904-441f-871b-57c691c528e2-kube-api-access-znt8k\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.054138 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.054191 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d74ba4a-c904-441f-871b-57c691c528e2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.054213 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d74ba4a-c904-441f-871b-57c691c528e2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.054306 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-config-data-default\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.156493 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-config-data-default\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.156632 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-kolla-config\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.156657 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d74ba4a-c904-441f-871b-57c691c528e2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.157396 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-config-data-default\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.157497 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.157547 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.157678 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znt8k\" (UniqueName: \"kubernetes.io/projected/2d74ba4a-c904-441f-871b-57c691c528e2-kube-api-access-znt8k\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.157729 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.157759 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d74ba4a-c904-441f-871b-57c691c528e2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.157777 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d74ba4a-c904-441f-871b-57c691c528e2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.158286 
4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.158312 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d74ba4a-c904-441f-871b-57c691c528e2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.159180 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d74ba4a-c904-441f-871b-57c691c528e2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.250228 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znt8k\" (UniqueName: \"kubernetes.io/projected/2d74ba4a-c904-441f-871b-57c691c528e2-kube-api-access-znt8k\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.250779 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.251401 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2d74ba4a-c904-441f-871b-57c691c528e2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.254249 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d74ba4a-c904-441f-871b-57c691c528e2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2d74ba4a-c904-441f-871b-57c691c528e2\") " pod="openstack/openstack-galera-0" Dec 03 19:10:35 crc kubenswrapper[4731]: I1203 19:10:35.635849 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.398016 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.399641 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.402368 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.402941 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5xpt9" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.403053 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.409004 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.417791 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.445002 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.446150 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.452123 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.452522 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9gj97" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.452706 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.465113 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.554850 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.554967 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.555036 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d340d800-c6f0-4375-81ed-d993a19950dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.555068 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.555235 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/286db73c-ad17-4f3b-aeb8-d8423872a2a1-config-data\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.555725 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d340d800-c6f0-4375-81ed-d993a19950dd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.555806 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/286db73c-ad17-4f3b-aeb8-d8423872a2a1-kolla-config\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.556006 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92j99\" (UniqueName: \"kubernetes.io/projected/d340d800-c6f0-4375-81ed-d993a19950dd-kube-api-access-92j99\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.556156 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s8dg\" (UniqueName: \"kubernetes.io/projected/286db73c-ad17-4f3b-aeb8-d8423872a2a1-kube-api-access-6s8dg\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.556283 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.556671 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/286db73c-ad17-4f3b-aeb8-d8423872a2a1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.556767 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d340d800-c6f0-4375-81ed-d993a19950dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.556865 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/286db73c-ad17-4f3b-aeb8-d8423872a2a1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659278 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659363 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659397 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d340d800-c6f0-4375-81ed-d993a19950dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659470 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659508 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/286db73c-ad17-4f3b-aeb8-d8423872a2a1-config-data\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659554 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d340d800-c6f0-4375-81ed-d993a19950dd-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659592 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/286db73c-ad17-4f3b-aeb8-d8423872a2a1-kolla-config\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659645 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92j99\" (UniqueName: \"kubernetes.io/projected/d340d800-c6f0-4375-81ed-d993a19950dd-kube-api-access-92j99\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659697 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s8dg\" (UniqueName: \"kubernetes.io/projected/286db73c-ad17-4f3b-aeb8-d8423872a2a1-kube-api-access-6s8dg\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659733 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659767 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/286db73c-ad17-4f3b-aeb8-d8423872a2a1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 
19:10:36.659797 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d340d800-c6f0-4375-81ed-d993a19950dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.659826 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/286db73c-ad17-4f3b-aeb8-d8423872a2a1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.660215 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d340d800-c6f0-4375-81ed-d993a19950dd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.660476 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.660500 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.660503 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/286db73c-ad17-4f3b-aeb8-d8423872a2a1-kolla-config\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.660713 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/286db73c-ad17-4f3b-aeb8-d8423872a2a1-config-data\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.661413 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.665913 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d340d800-c6f0-4375-81ed-d993a19950dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.667111 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/286db73c-ad17-4f3b-aeb8-d8423872a2a1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.667645 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d340d800-c6f0-4375-81ed-d993a19950dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " 
pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.672192 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d340d800-c6f0-4375-81ed-d993a19950dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.681835 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/286db73c-ad17-4f3b-aeb8-d8423872a2a1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.682983 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s8dg\" (UniqueName: \"kubernetes.io/projected/286db73c-ad17-4f3b-aeb8-d8423872a2a1-kube-api-access-6s8dg\") pod \"memcached-0\" (UID: \"286db73c-ad17-4f3b-aeb8-d8423872a2a1\") " pod="openstack/memcached-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.702647 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.709836 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92j99\" (UniqueName: \"kubernetes.io/projected/d340d800-c6f0-4375-81ed-d993a19950dd-kube-api-access-92j99\") pod \"openstack-cell1-galera-0\" (UID: \"d340d800-c6f0-4375-81ed-d993a19950dd\") " pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.719847 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 19:10:36 crc kubenswrapper[4731]: I1203 19:10:36.775023 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 19:10:38 crc kubenswrapper[4731]: I1203 19:10:38.653167 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:10:38 crc kubenswrapper[4731]: I1203 19:10:38.654691 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 19:10:38 crc kubenswrapper[4731]: I1203 19:10:38.657739 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lx2md" Dec 03 19:10:38 crc kubenswrapper[4731]: I1203 19:10:38.661802 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:10:38 crc kubenswrapper[4731]: I1203 19:10:38.810587 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwtfm\" (UniqueName: \"kubernetes.io/projected/8a549a4d-e989-42bf-8e74-556e6feb9507-kube-api-access-rwtfm\") pod \"kube-state-metrics-0\" (UID: \"8a549a4d-e989-42bf-8e74-556e6feb9507\") " pod="openstack/kube-state-metrics-0" Dec 03 19:10:38 crc kubenswrapper[4731]: I1203 19:10:38.911836 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwtfm\" (UniqueName: \"kubernetes.io/projected/8a549a4d-e989-42bf-8e74-556e6feb9507-kube-api-access-rwtfm\") pod \"kube-state-metrics-0\" (UID: \"8a549a4d-e989-42bf-8e74-556e6feb9507\") " pod="openstack/kube-state-metrics-0" Dec 03 19:10:38 crc kubenswrapper[4731]: I1203 19:10:38.929737 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwtfm\" (UniqueName: \"kubernetes.io/projected/8a549a4d-e989-42bf-8e74-556e6feb9507-kube-api-access-rwtfm\") pod \"kube-state-metrics-0\" (UID: 
\"8a549a4d-e989-42bf-8e74-556e6feb9507\") " pod="openstack/kube-state-metrics-0" Dec 03 19:10:38 crc kubenswrapper[4731]: I1203 19:10:38.989159 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.022801 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s8zdm"] Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.024410 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.028026 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-cpszx" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.028224 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.028420 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.041515 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s8zdm"] Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.067676 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-run-ovn\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.067779 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-run\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " 
pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.067839 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2xvk\" (UniqueName: \"kubernetes.io/projected/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-kube-api-access-x2xvk\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.067929 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-combined-ca-bundle\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.067960 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-scripts\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.067997 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-log-ovn\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.068047 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-ovn-controller-tls-certs\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " 
pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.098805 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4rrp7"] Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.100557 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.106548 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4rrp7"] Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172491 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2xvk\" (UniqueName: \"kubernetes.io/projected/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-kube-api-access-x2xvk\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172558 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-lib\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172590 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-log\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172642 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-combined-ca-bundle\") pod \"ovn-controller-s8zdm\" (UID: 
\"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172661 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-scripts\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172684 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-log-ovn\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172705 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-etc-ovs\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172735 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-run\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172755 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-ovn-controller-tls-certs\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc 
kubenswrapper[4731]: I1203 19:10:42.172775 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-run-ovn\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172796 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh85q\" (UniqueName: \"kubernetes.io/projected/c42cb210-65c5-43db-84de-7ecf24807aab-kube-api-access-xh85q\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172817 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-run\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.172838 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c42cb210-65c5-43db-84de-7ecf24807aab-scripts\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.174617 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-run-ovn\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.175102 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-scripts\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.175294 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-log-ovn\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.175408 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-var-run\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.180305 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-ovn-controller-tls-certs\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.192197 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2xvk\" (UniqueName: \"kubernetes.io/projected/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-kube-api-access-x2xvk\") pod \"ovn-controller-s8zdm\" (UID: \"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.199731 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3-combined-ca-bundle\") pod \"ovn-controller-s8zdm\" (UID: 
\"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3\") " pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.274266 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-etc-ovs\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.274343 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-run\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.274391 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh85q\" (UniqueName: \"kubernetes.io/projected/c42cb210-65c5-43db-84de-7ecf24807aab-kube-api-access-xh85q\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.274435 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c42cb210-65c5-43db-84de-7ecf24807aab-scripts\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.274482 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-lib\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 
19:10:42.274519 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-log\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.274836 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-log\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.274844 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-run\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.274900 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-etc-ovs\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.275112 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c42cb210-65c5-43db-84de-7ecf24807aab-var-lib\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.277724 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c42cb210-65c5-43db-84de-7ecf24807aab-scripts\") pod 
\"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.300347 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh85q\" (UniqueName: \"kubernetes.io/projected/c42cb210-65c5-43db-84de-7ecf24807aab-kube-api-access-xh85q\") pod \"ovn-controller-ovs-4rrp7\" (UID: \"c42cb210-65c5-43db-84de-7ecf24807aab\") " pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.350861 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s8zdm" Dec 03 19:10:42 crc kubenswrapper[4731]: I1203 19:10:42.421294 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:10:43 crc kubenswrapper[4731]: I1203 19:10:43.938083 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 19:10:43 crc kubenswrapper[4731]: I1203 19:10:43.940304 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:43 crc kubenswrapper[4731]: I1203 19:10:43.942301 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 19:10:43 crc kubenswrapper[4731]: I1203 19:10:43.942739 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 19:10:43 crc kubenswrapper[4731]: I1203 19:10:43.942921 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dstdm" Dec 03 19:10:43 crc kubenswrapper[4731]: I1203 19:10:43.942925 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 19:10:43 crc kubenswrapper[4731]: I1203 19:10:43.943792 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 19:10:43 crc kubenswrapper[4731]: I1203 19:10:43.955780 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.105638 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.105961 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.105989 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/d88aab08-1249-4391-b0e2-ec8f0704e7c3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.106019 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.106071 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2dvl\" (UniqueName: \"kubernetes.io/projected/d88aab08-1249-4391-b0e2-ec8f0704e7c3-kube-api-access-v2dvl\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.106110 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d88aab08-1249-4391-b0e2-ec8f0704e7c3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.106168 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.106189 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/d88aab08-1249-4391-b0e2-ec8f0704e7c3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.207644 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.207704 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d88aab08-1249-4391-b0e2-ec8f0704e7c3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.207733 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.207752 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.207773 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2dvl\" (UniqueName: \"kubernetes.io/projected/d88aab08-1249-4391-b0e2-ec8f0704e7c3-kube-api-access-v2dvl\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " 
pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.207800 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d88aab08-1249-4391-b0e2-ec8f0704e7c3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.207840 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.207858 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d88aab08-1249-4391-b0e2-ec8f0704e7c3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.209212 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d88aab08-1249-4391-b0e2-ec8f0704e7c3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.209512 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.209646 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d88aab08-1249-4391-b0e2-ec8f0704e7c3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.209956 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d88aab08-1249-4391-b0e2-ec8f0704e7c3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.213989 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.214123 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.214132 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d88aab08-1249-4391-b0e2-ec8f0704e7c3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.225736 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2dvl\" (UniqueName: \"kubernetes.io/projected/d88aab08-1249-4391-b0e2-ec8f0704e7c3-kube-api-access-v2dvl\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " 
pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.231784 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d88aab08-1249-4391-b0e2-ec8f0704e7c3\") " pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:44 crc kubenswrapper[4731]: I1203 19:10:44.330094 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 19:10:45 crc kubenswrapper[4731]: I1203 19:10:45.939881 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 19:10:45 crc kubenswrapper[4731]: I1203 19:10:45.941806 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:45 crc kubenswrapper[4731]: I1203 19:10:45.954884 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rlcmg" Dec 03 19:10:45 crc kubenswrapper[4731]: I1203 19:10:45.955117 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 19:10:45 crc kubenswrapper[4731]: I1203 19:10:45.955277 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 19:10:45 crc kubenswrapper[4731]: I1203 19:10:45.955273 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 19:10:45 crc kubenswrapper[4731]: I1203 19:10:45.981847 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.069500 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-config\") pod \"ovsdbserver-nb-0\" 
(UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.069605 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.069765 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.069899 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.069933 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.069960 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4pc\" (UniqueName: \"kubernetes.io/projected/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-kube-api-access-qp4pc\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.070082 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.070281 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.171886 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.171936 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.171970 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.171995 4731 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.172014 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4pc\" (UniqueName: \"kubernetes.io/projected/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-kube-api-access-qp4pc\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.172050 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.172085 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.172128 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-config\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.172191 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.172785 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.173720 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-config\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.175273 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.180246 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.181809 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.182594 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.191990 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4pc\" (UniqueName: \"kubernetes.io/projected/d2a6487b-2b25-4b00-a4c3-4c11caa0da2b-kube-api-access-qp4pc\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.196283 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b\") " pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:46 crc kubenswrapper[4731]: I1203 19:10:46.263160 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 19:10:54 crc kubenswrapper[4731]: I1203 19:10:54.268038 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:10:55 crc kubenswrapper[4731]: E1203 19:10:55.004346 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 19:10:55 crc kubenswrapper[4731]: E1203 19:10:55.004529 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5fb8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-nd6m6_openstack(9f3bbd9e-0b1e-4199-b050-07d7d110e502): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:10:55 crc kubenswrapper[4731]: E1203 19:10:55.005775 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" Dec 03 19:10:55 crc kubenswrapper[4731]: I1203 19:10:55.186991 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b","Type":"ContainerStarted","Data":"1f3b041aeeab1b3901a03578b28b9d16a47457ab118073db9a879c43c20cc1a5"} Dec 03 19:10:55 crc kubenswrapper[4731]: E1203 19:10:55.189190 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.315024 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4rrp7"] Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.483318 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.493136 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s8zdm"] Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.504419 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 19:10:56 crc kubenswrapper[4731]: W1203 19:10:56.507525 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8f5bdad_d0e4_418c_b6f0_0c1c6f0da2c3.slice/crio-34cb490a37a81c1229eaf8c6c59185202a5aa58ab9498abeddd219b251ab6222 WatchSource:0}: Error finding container 34cb490a37a81c1229eaf8c6c59185202a5aa58ab9498abeddd219b251ab6222: Status 404 returned error can't find the container with id 34cb490a37a81c1229eaf8c6c59185202a5aa58ab9498abeddd219b251ab6222 Dec 03 19:10:56 crc 
kubenswrapper[4731]: W1203 19:10:56.512459 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd340d800_c6f0_4375_81ed_d993a19950dd.slice/crio-6aed8bccca277f1deaa5e898c144936e884c9d6f2f4e4764f146e55361beaf20 WatchSource:0}: Error finding container 6aed8bccca277f1deaa5e898c144936e884c9d6f2f4e4764f146e55361beaf20: Status 404 returned error can't find the container with id 6aed8bccca277f1deaa5e898c144936e884c9d6f2f4e4764f146e55361beaf20 Dec 03 19:10:56 crc kubenswrapper[4731]: W1203 19:10:56.514185 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod286db73c_ad17_4f3b_aeb8_d8423872a2a1.slice/crio-1a2b588ca6c9f32ed59781363242e665215112ded97d6ee5ffbb7251c16e2735 WatchSource:0}: Error finding container 1a2b588ca6c9f32ed59781363242e665215112ded97d6ee5ffbb7251c16e2735: Status 404 returned error can't find the container with id 1a2b588ca6c9f32ed59781363242e665215112ded97d6ee5ffbb7251c16e2735 Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.514207 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 19:10:56 crc kubenswrapper[4731]: W1203 19:10:56.521334 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d74ba4a_c904_441f_871b_57c691c528e2.slice/crio-36cb04718e7d6c460cc0531a8e3d44feb7c87a116d970a82394095086262d827 WatchSource:0}: Error finding container 36cb04718e7d6c460cc0531a8e3d44feb7c87a116d970a82394095086262d827: Status 404 returned error can't find the container with id 36cb04718e7d6c460cc0531a8e3d44feb7c87a116d970a82394095086262d827 Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.521806 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:10:56 crc kubenswrapper[4731]: W1203 19:10:56.523802 4731 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod469480bc_e167_4ecc_87c4_9691057d999f.slice/crio-24178e0a31d94ef5ea9772de56e020cdbc29116d7a7c895ba997a0a7209b4476 WatchSource:0}: Error finding container 24178e0a31d94ef5ea9772de56e020cdbc29116d7a7c895ba997a0a7209b4476: Status 404 returned error can't find the container with id 24178e0a31d94ef5ea9772de56e020cdbc29116d7a7c895ba997a0a7209b4476 Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.528803 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.601592 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 19:10:56 crc kubenswrapper[4731]: W1203 19:10:56.695583 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd88aab08_1249_4391_b0e2_ec8f0704e7c3.slice/crio-86020a4361505560b2d14e2e6442995cd92f8a44f669a59532ff50185725714a WatchSource:0}: Error finding container 86020a4361505560b2d14e2e6442995cd92f8a44f669a59532ff50185725714a: Status 404 returned error can't find the container with id 86020a4361505560b2d14e2e6442995cd92f8a44f669a59532ff50185725714a Dec 03 19:10:56 crc kubenswrapper[4731]: I1203 19:10:56.705315 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.201364 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d340d800-c6f0-4375-81ed-d993a19950dd","Type":"ContainerStarted","Data":"6aed8bccca277f1deaa5e898c144936e884c9d6f2f4e4764f146e55361beaf20"} Dec 03 19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.202789 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"d88aab08-1249-4391-b0e2-ec8f0704e7c3","Type":"ContainerStarted","Data":"86020a4361505560b2d14e2e6442995cd92f8a44f669a59532ff50185725714a"} Dec 03 19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.204056 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rrp7" event={"ID":"c42cb210-65c5-43db-84de-7ecf24807aab","Type":"ContainerStarted","Data":"30cf8697fe741da1af94a729e518c5f384a6ed50efc78c06c61e07fc29afc148"} Dec 03 19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.205583 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a549a4d-e989-42bf-8e74-556e6feb9507","Type":"ContainerStarted","Data":"9e857202732871b12e49db354f409e4a09f31780c6d670bfb5969ca88f85d937"} Dec 03 19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.207285 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"286db73c-ad17-4f3b-aeb8-d8423872a2a1","Type":"ContainerStarted","Data":"1a2b588ca6c9f32ed59781363242e665215112ded97d6ee5ffbb7251c16e2735"} Dec 03 19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.208526 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469480bc-e167-4ecc-87c4-9691057d999f","Type":"ContainerStarted","Data":"24178e0a31d94ef5ea9772de56e020cdbc29116d7a7c895ba997a0a7209b4476"} Dec 03 19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.210023 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b","Type":"ContainerStarted","Data":"8c6c7f833d2d3580610978a6d38161fb9f3eb2edc9478beac2f5190dcbb94eb9"} Dec 03 19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.211526 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s8zdm" event={"ID":"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3","Type":"ContainerStarted","Data":"34cb490a37a81c1229eaf8c6c59185202a5aa58ab9498abeddd219b251ab6222"} Dec 03 
19:10:57 crc kubenswrapper[4731]: I1203 19:10:57.212952 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2d74ba4a-c904-441f-871b-57c691c528e2","Type":"ContainerStarted","Data":"36cb04718e7d6c460cc0531a8e3d44feb7c87a116d970a82394095086262d827"} Dec 03 19:10:59 crc kubenswrapper[4731]: I1203 19:10:59.269103 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b","Type":"ContainerStarted","Data":"fc32c944255cdac00d14416c11b5771f9a9f1e781738b3debf50039e4d9a17fd"} Dec 03 19:11:00 crc kubenswrapper[4731]: I1203 19:11:00.297270 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469480bc-e167-4ecc-87c4-9691057d999f","Type":"ContainerStarted","Data":"58682561d8a5f5c7c84ae4f7bf28b58db8754a15808e79e7e8013b7ec94685b2"} Dec 03 19:11:09 crc kubenswrapper[4731]: E1203 19:11:09.029459 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Dec 03 19:11:09 crc kubenswrapper[4731]: E1203 19:11:09.030290 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbh5h5f4h5d7h54dh95h56chb5hdbh659h9bh64bh64ch66h8dh546hbfh699h5cfh5h57fh647h64h6fh56bh678h674h66chfch5f5h5f5h6q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2dvl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Com
mand:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(d88aab08-1249-4391-b0e2-ec8f0704e7c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:11:09 crc kubenswrapper[4731]: E1203 19:11:09.421187 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 03 19:11:09 crc 
kubenswrapper[4731]: E1203 19:11:09.421685 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58dh5f9h8fh6dh97hfdh585h667hc9h58hbh694h57dh8ch665hd5h646h5bh66bh555h57fh5cfh55bh5d4h5bdh88h59ch5c5h667hchchf5q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2xvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-s8zdm_openstack(d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:11:09 crc kubenswrapper[4731]: E1203 19:11:09.423137 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-s8zdm" podUID="d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3" Dec 03 19:11:09 crc kubenswrapper[4731]: E1203 19:11:09.591286 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-s8zdm" podUID="d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3" Dec 03 19:11:10 crc kubenswrapper[4731]: E1203 19:11:10.244598 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 03 19:11:10 crc kubenswrapper[4731]: E1203 19:11:10.245220 4731 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 03 19:11:10 crc kubenswrapper[4731]: E1203 19:11:10.245529 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rwtfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(8a549a4d-e989-42bf-8e74-556e6feb9507): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Dec 03 19:11:10 crc kubenswrapper[4731]: E1203 19:11:10.247010 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="8a549a4d-e989-42bf-8e74-556e6feb9507" Dec 03 19:11:10 crc kubenswrapper[4731]: I1203 19:11:10.596509 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"286db73c-ad17-4f3b-aeb8-d8423872a2a1","Type":"ContainerStarted","Data":"aa38373eb536153fdbf14e6110aa1428ec9e69c4839988970927ad35ef4c95d3"} Dec 03 19:11:10 crc kubenswrapper[4731]: I1203 19:11:10.596957 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 19:11:10 crc kubenswrapper[4731]: I1203 19:11:10.598422 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b","Type":"ContainerStarted","Data":"01273cd4f842a0184eecd999a1657f04fc5415e29cf22e97d122a065229a6c32"} Dec 03 19:11:10 crc kubenswrapper[4731]: I1203 19:11:10.600778 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2d74ba4a-c904-441f-871b-57c691c528e2","Type":"ContainerStarted","Data":"2225b927de81561237a5f69a2f608eb09ec90e0ea03ce2de662be4687bf5b28c"} Dec 03 19:11:10 crc kubenswrapper[4731]: I1203 19:11:10.603899 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d340d800-c6f0-4375-81ed-d993a19950dd","Type":"ContainerStarted","Data":"d741c5f180bbb245b2a448f57d42b27993fc00b23db645afbd41cecd7a795b59"} Dec 03 19:11:10 crc kubenswrapper[4731]: E1203 19:11:10.605077 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="8a549a4d-e989-42bf-8e74-556e6feb9507" Dec 03 19:11:10 crc kubenswrapper[4731]: I1203 19:11:10.623474 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.72049762 podStartE2EDuration="34.623444637s" podCreationTimestamp="2025-12-03 19:10:36 +0000 UTC" firstStartedPulling="2025-12-03 19:10:56.516463398 +0000 UTC m=+977.115057862" lastFinishedPulling="2025-12-03 19:11:09.419410375 +0000 UTC m=+990.018004879" observedRunningTime="2025-12-03 19:11:10.621064383 +0000 UTC m=+991.219658847" watchObservedRunningTime="2025-12-03 19:11:10.623444637 +0000 UTC m=+991.222039131" Dec 03 19:11:11 crc kubenswrapper[4731]: I1203 19:11:11.611975 4731 generic.go:334] "Generic (PLEG): container finished" podID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" containerID="c33c07bcf9c228b0c115d8a6fd6436f358c76979be950eb742dc8ff2b2237623" exitCode=0 Dec 03 19:11:11 crc kubenswrapper[4731]: I1203 19:11:11.612092 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" event={"ID":"9f3bbd9e-0b1e-4199-b050-07d7d110e502","Type":"ContainerDied","Data":"c33c07bcf9c228b0c115d8a6fd6436f358c76979be950eb742dc8ff2b2237623"} Dec 03 19:11:11 crc kubenswrapper[4731]: I1203 19:11:11.615060 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rrp7" event={"ID":"c42cb210-65c5-43db-84de-7ecf24807aab","Type":"ContainerStarted","Data":"a5bf6b9c314ca09608fac3bc6aa2e5d801dd1f0221dd438bc0f3a4b6add4cd4d"} Dec 03 19:11:12 crc kubenswrapper[4731]: I1203 19:11:12.626503 4731 generic.go:334] "Generic (PLEG): container finished" podID="c42cb210-65c5-43db-84de-7ecf24807aab" containerID="a5bf6b9c314ca09608fac3bc6aa2e5d801dd1f0221dd438bc0f3a4b6add4cd4d" exitCode=0 Dec 03 19:11:12 crc 
kubenswrapper[4731]: I1203 19:11:12.626561 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rrp7" event={"ID":"c42cb210-65c5-43db-84de-7ecf24807aab","Type":"ContainerDied","Data":"a5bf6b9c314ca09608fac3bc6aa2e5d801dd1f0221dd438bc0f3a4b6add4cd4d"} Dec 03 19:11:13 crc kubenswrapper[4731]: E1203 19:11:13.333928 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="d88aab08-1249-4391-b0e2-ec8f0704e7c3" Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.636966 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" event={"ID":"9f3bbd9e-0b1e-4199-b050-07d7d110e502","Type":"ContainerStarted","Data":"2077ff89c13eb358308e490050ddc1f16268e960c33165fadc297c005ec34f22"} Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.637755 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.639045 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d2a6487b-2b25-4b00-a4c3-4c11caa0da2b","Type":"ContainerStarted","Data":"5320e359db7c695a558bc6992a73252fa18609bf0df412f29b465483ab3f8798"} Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.641202 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d88aab08-1249-4391-b0e2-ec8f0704e7c3","Type":"ContainerStarted","Data":"94f4eb4167dd6863e3c56249f635038dbac050661fb4fa5bda8e9d5e7ac5813a"} Dec 03 19:11:13 crc kubenswrapper[4731]: E1203 19:11:13.643078 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="d88aab08-1249-4391-b0e2-ec8f0704e7c3" Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.644711 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rrp7" event={"ID":"c42cb210-65c5-43db-84de-7ecf24807aab","Type":"ContainerStarted","Data":"6cbe282788c1e45cc0568f098ae63672f0d4411b8e39d4e41b529951e41ff3c2"} Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.644815 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rrp7" event={"ID":"c42cb210-65c5-43db-84de-7ecf24807aab","Type":"ContainerStarted","Data":"66201c83b9779a6412a206942b5f3a273b913803ab8eeb08176d8935bf11b4a8"} Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.644904 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.644993 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4rrp7" Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.664237 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" podStartSLOduration=4.809337827 podStartE2EDuration="45.664214817s" podCreationTimestamp="2025-12-03 19:10:28 +0000 UTC" firstStartedPulling="2025-12-03 19:10:29.748868028 +0000 UTC m=+950.347462492" lastFinishedPulling="2025-12-03 19:11:10.603745018 +0000 UTC m=+991.202339482" observedRunningTime="2025-12-03 19:11:13.652379651 +0000 UTC m=+994.250974115" watchObservedRunningTime="2025-12-03 19:11:13.664214817 +0000 UTC m=+994.262809281" Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.688694 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4rrp7" podStartSLOduration=18.321279752 
podStartE2EDuration="31.688652181s" podCreationTimestamp="2025-12-03 19:10:42 +0000 UTC" firstStartedPulling="2025-12-03 19:10:56.343672282 +0000 UTC m=+976.942266756" lastFinishedPulling="2025-12-03 19:11:09.711044721 +0000 UTC m=+990.309639185" observedRunningTime="2025-12-03 19:11:13.688551048 +0000 UTC m=+994.287145512" watchObservedRunningTime="2025-12-03 19:11:13.688652181 +0000 UTC m=+994.287246655" Dec 03 19:11:13 crc kubenswrapper[4731]: I1203 19:11:13.736661 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.305812943 podStartE2EDuration="29.736631653s" podCreationTimestamp="2025-12-03 19:10:44 +0000 UTC" firstStartedPulling="2025-12-03 19:10:56.622816572 +0000 UTC m=+977.221411046" lastFinishedPulling="2025-12-03 19:11:13.053635262 +0000 UTC m=+993.652229756" observedRunningTime="2025-12-03 19:11:13.733373232 +0000 UTC m=+994.331967706" watchObservedRunningTime="2025-12-03 19:11:13.736631653 +0000 UTC m=+994.335226147" Dec 03 19:11:14 crc kubenswrapper[4731]: E1203 19:11:14.661186 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="d88aab08-1249-4391-b0e2-ec8f0704e7c3" Dec 03 19:11:15 crc kubenswrapper[4731]: I1203 19:11:15.665744 4731 generic.go:334] "Generic (PLEG): container finished" podID="2d74ba4a-c904-441f-871b-57c691c528e2" containerID="2225b927de81561237a5f69a2f608eb09ec90e0ea03ce2de662be4687bf5b28c" exitCode=0 Dec 03 19:11:15 crc kubenswrapper[4731]: I1203 19:11:15.665824 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2d74ba4a-c904-441f-871b-57c691c528e2","Type":"ContainerDied","Data":"2225b927de81561237a5f69a2f608eb09ec90e0ea03ce2de662be4687bf5b28c"} Dec 03 19:11:15 crc 
kubenswrapper[4731]: I1203 19:11:15.667779 4731 generic.go:334] "Generic (PLEG): container finished" podID="d340d800-c6f0-4375-81ed-d993a19950dd" containerID="d741c5f180bbb245b2a448f57d42b27993fc00b23db645afbd41cecd7a795b59" exitCode=0 Dec 03 19:11:15 crc kubenswrapper[4731]: I1203 19:11:15.667808 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d340d800-c6f0-4375-81ed-d993a19950dd","Type":"ContainerDied","Data":"d741c5f180bbb245b2a448f57d42b27993fc00b23db645afbd41cecd7a795b59"} Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.263522 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.263986 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.331535 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.678001 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2d74ba4a-c904-441f-871b-57c691c528e2","Type":"ContainerStarted","Data":"9e17242674b37a1ae53fad8e77695fc62b4b5262ca929cc0da4b888bcabc2973"} Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.679815 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d340d800-c6f0-4375-81ed-d993a19950dd","Type":"ContainerStarted","Data":"9d4af3755e3ff281a9b9e023476bbe54671d54aab0accf9e83e04e40b99452a8"} Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.708932 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=30.282666491 podStartE2EDuration="43.708899428s" podCreationTimestamp="2025-12-03 19:10:33 +0000 UTC" firstStartedPulling="2025-12-03 
19:10:56.526167238 +0000 UTC m=+977.124761702" lastFinishedPulling="2025-12-03 19:11:09.952400175 +0000 UTC m=+990.550994639" observedRunningTime="2025-12-03 19:11:16.697014902 +0000 UTC m=+997.295609376" watchObservedRunningTime="2025-12-03 19:11:16.708899428 +0000 UTC m=+997.307493902" Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.720981 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.722989 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.728451 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.617749198 podStartE2EDuration="41.728429331s" podCreationTimestamp="2025-12-03 19:10:35 +0000 UTC" firstStartedPulling="2025-12-03 19:10:56.513713703 +0000 UTC m=+977.112308167" lastFinishedPulling="2025-12-03 19:11:09.624393836 +0000 UTC m=+990.222988300" observedRunningTime="2025-12-03 19:11:16.722506129 +0000 UTC m=+997.321100593" watchObservedRunningTime="2025-12-03 19:11:16.728429331 +0000 UTC m=+997.327023795" Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.731386 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 19:11:16 crc kubenswrapper[4731]: I1203 19:11:16.776834 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.026186 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f755d8cd7-x6nnz"] Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.029332 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.032207 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.050306 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f755d8cd7-x6nnz"] Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.134638 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-r76nc"] Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.135889 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.138170 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.144919 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-r76nc"] Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.166523 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-config\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.166639 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr29p\" (UniqueName: \"kubernetes.io/projected/e2f89089-2dc7-4130-8153-6e5516c727ee-kube-api-access-wr29p\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.166721 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268195 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/077da0f9-1e75-450b-b6b1-921c8ff9950b-ovs-rundir\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268281 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077da0f9-1e75-450b-b6b1-921c8ff9950b-combined-ca-bundle\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268310 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/077da0f9-1e75-450b-b6b1-921c8ff9950b-ovn-rundir\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268337 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077da0f9-1e75-450b-b6b1-921c8ff9950b-config\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268364 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr978\" (UniqueName: \"kubernetes.io/projected/077da0f9-1e75-450b-b6b1-921c8ff9950b-kube-api-access-rr978\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268399 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-config\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268438 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr29p\" (UniqueName: \"kubernetes.io/projected/e2f89089-2dc7-4130-8153-6e5516c727ee-kube-api-access-wr29p\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268471 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/077da0f9-1e75-450b-b6b1-921c8ff9950b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.268488 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 
19:11:17.269236 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-config\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.269267 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.292297 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr29p\" (UniqueName: \"kubernetes.io/projected/e2f89089-2dc7-4130-8153-6e5516c727ee-kube-api-access-wr29p\") pod \"dnsmasq-dns-7f755d8cd7-x6nnz\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.317114 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nd6m6"] Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.317433 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" containerName="dnsmasq-dns" containerID="cri-o://2077ff89c13eb358308e490050ddc1f16268e960c33165fadc297c005ec34f22" gracePeriod=10 Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.341623 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bc95b79f5-smj25"] Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.343112 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.345692 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.354923 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bc95b79f5-smj25"] Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.355319 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.369409 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/077da0f9-1e75-450b-b6b1-921c8ff9950b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.369753 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/077da0f9-1e75-450b-b6b1-921c8ff9950b-ovs-rundir\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.369794 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077da0f9-1e75-450b-b6b1-921c8ff9950b-combined-ca-bundle\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.369818 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/077da0f9-1e75-450b-b6b1-921c8ff9950b-ovn-rundir\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.369845 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077da0f9-1e75-450b-b6b1-921c8ff9950b-config\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.369877 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr978\" (UniqueName: \"kubernetes.io/projected/077da0f9-1e75-450b-b6b1-921c8ff9950b-kube-api-access-rr978\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.370786 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/077da0f9-1e75-450b-b6b1-921c8ff9950b-ovn-rundir\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.370786 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/077da0f9-1e75-450b-b6b1-921c8ff9950b-ovs-rundir\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.371430 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077da0f9-1e75-450b-b6b1-921c8ff9950b-config\") pod 
\"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.375206 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077da0f9-1e75-450b-b6b1-921c8ff9950b-combined-ca-bundle\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.396483 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr978\" (UniqueName: \"kubernetes.io/projected/077da0f9-1e75-450b-b6b1-921c8ff9950b-kube-api-access-rr978\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.401771 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/077da0f9-1e75-450b-b6b1-921c8ff9950b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r76nc\" (UID: \"077da0f9-1e75-450b-b6b1-921c8ff9950b\") " pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.456070 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-r76nc" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.472016 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-config\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.472094 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-nb\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.472121 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-sb\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.472185 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4p9\" (UniqueName: \"kubernetes.io/projected/4d11f484-2fc0-41d0-bac8-24eb2d301146-kube-api-access-zq4p9\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.577960 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-nb\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: 
\"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.578275 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-sb\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.578327 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4p9\" (UniqueName: \"kubernetes.io/projected/4d11f484-2fc0-41d0-bac8-24eb2d301146-kube-api-access-zq4p9\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.578408 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-config\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.579300 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-config\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.581216 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-nb\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" 
Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.581985 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-sb\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.599287 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4p9\" (UniqueName: \"kubernetes.io/projected/4d11f484-2fc0-41d0-bac8-24eb2d301146-kube-api-access-zq4p9\") pod \"dnsmasq-dns-5bc95b79f5-smj25\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.665480 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.689407 4731 generic.go:334] "Generic (PLEG): container finished" podID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" containerID="2077ff89c13eb358308e490050ddc1f16268e960c33165fadc297c005ec34f22" exitCode=0 Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.689597 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" event={"ID":"9f3bbd9e-0b1e-4199-b050-07d7d110e502","Type":"ContainerDied","Data":"2077ff89c13eb358308e490050ddc1f16268e960c33165fadc297c005ec34f22"} Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.907095 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.918191 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f755d8cd7-x6nnz"] Dec 03 19:11:17 crc kubenswrapper[4731]: W1203 19:11:17.925230 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2f89089_2dc7_4130_8153_6e5516c727ee.slice/crio-e07d9b02560aa040f241306af2a72a0493b1155f2807140d4f8254fb2257a6f7 WatchSource:0}: Error finding container e07d9b02560aa040f241306af2a72a0493b1155f2807140d4f8254fb2257a6f7: Status 404 returned error can't find the container with id e07d9b02560aa040f241306af2a72a0493b1155f2807140d4f8254fb2257a6f7 Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.984618 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3bbd9e-0b1e-4199-b050-07d7d110e502-config\") pod \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\" (UID: \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\") " Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.985076 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5fb8\" (UniqueName: \"kubernetes.io/projected/9f3bbd9e-0b1e-4199-b050-07d7d110e502-kube-api-access-n5fb8\") pod \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\" (UID: \"9f3bbd9e-0b1e-4199-b050-07d7d110e502\") " Dec 03 19:11:17 crc kubenswrapper[4731]: I1203 19:11:17.991374 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3bbd9e-0b1e-4199-b050-07d7d110e502-kube-api-access-n5fb8" (OuterVolumeSpecName: "kube-api-access-n5fb8") pod "9f3bbd9e-0b1e-4199-b050-07d7d110e502" (UID: "9f3bbd9e-0b1e-4199-b050-07d7d110e502"). InnerVolumeSpecName "kube-api-access-n5fb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.047533 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3bbd9e-0b1e-4199-b050-07d7d110e502-config" (OuterVolumeSpecName: "config") pod "9f3bbd9e-0b1e-4199-b050-07d7d110e502" (UID: "9f3bbd9e-0b1e-4199-b050-07d7d110e502"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.063966 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-r76nc"] Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.086596 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5fb8\" (UniqueName: \"kubernetes.io/projected/9f3bbd9e-0b1e-4199-b050-07d7d110e502-kube-api-access-n5fb8\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.086622 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3bbd9e-0b1e-4199-b050-07d7d110e502-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:18 crc kubenswrapper[4731]: W1203 19:11:18.329801 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d11f484_2fc0_41d0_bac8_24eb2d301146.slice/crio-72acb02cc94ed3f6ef8741f13994b5847cb8a9f4d7604f6534ae62bf5f4009ad WatchSource:0}: Error finding container 72acb02cc94ed3f6ef8741f13994b5847cb8a9f4d7604f6534ae62bf5f4009ad: Status 404 returned error can't find the container with id 72acb02cc94ed3f6ef8741f13994b5847cb8a9f4d7604f6534ae62bf5f4009ad Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.361654 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bc95b79f5-smj25"] Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.698190 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-r76nc" event={"ID":"077da0f9-1e75-450b-b6b1-921c8ff9950b","Type":"ContainerStarted","Data":"24fcb2388315f9e6624c677531c443805cfcd7cd19d043fa917a311261a62af8"} Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.698272 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r76nc" event={"ID":"077da0f9-1e75-450b-b6b1-921c8ff9950b","Type":"ContainerStarted","Data":"7fa6e1ed82e8c048c28dc3edb6027fa9d451441f5e642e7065b8562b49a4449d"} Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.701275 4731 generic.go:334] "Generic (PLEG): container finished" podID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerID="0cd55b53525e0f5d95d18acabfaf6ca0895b829ad75fd24de303502ce7b5f380" exitCode=0 Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.701338 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" event={"ID":"4d11f484-2fc0-41d0-bac8-24eb2d301146","Type":"ContainerDied","Data":"0cd55b53525e0f5d95d18acabfaf6ca0895b829ad75fd24de303502ce7b5f380"} Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.701366 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" event={"ID":"4d11f484-2fc0-41d0-bac8-24eb2d301146","Type":"ContainerStarted","Data":"72acb02cc94ed3f6ef8741f13994b5847cb8a9f4d7604f6534ae62bf5f4009ad"} Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.704759 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" event={"ID":"9f3bbd9e-0b1e-4199-b050-07d7d110e502","Type":"ContainerDied","Data":"ed5dca749417ad1eadc3873978388db8513ea564a687851f1b555a2bf3559295"} Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.704806 4731 scope.go:117] "RemoveContainer" containerID="2077ff89c13eb358308e490050ddc1f16268e960c33165fadc297c005ec34f22" Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.704952 4731 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nd6m6" Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.708457 4731 generic.go:334] "Generic (PLEG): container finished" podID="e2f89089-2dc7-4130-8153-6e5516c727ee" containerID="cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57" exitCode=0 Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.709456 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" event={"ID":"e2f89089-2dc7-4130-8153-6e5516c727ee","Type":"ContainerDied","Data":"cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57"} Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.709547 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" event={"ID":"e2f89089-2dc7-4130-8153-6e5516c727ee","Type":"ContainerStarted","Data":"e07d9b02560aa040f241306af2a72a0493b1155f2807140d4f8254fb2257a6f7"} Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.728435 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-r76nc" podStartSLOduration=1.728408631 podStartE2EDuration="1.728408631s" podCreationTimestamp="2025-12-03 19:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:11:18.719588579 +0000 UTC m=+999.318183063" watchObservedRunningTime="2025-12-03 19:11:18.728408631 +0000 UTC m=+999.327003095" Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.810467 4731 scope.go:117] "RemoveContainer" containerID="c33c07bcf9c228b0c115d8a6fd6436f358c76979be950eb742dc8ff2b2237623" Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.859786 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nd6m6"] Dec 03 19:11:18 crc kubenswrapper[4731]: I1203 19:11:18.875353 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-nd6m6"] Dec 03 19:11:19 crc kubenswrapper[4731]: I1203 19:11:19.867312 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" path="/var/lib/kubelet/pods/9f3bbd9e-0b1e-4199-b050-07d7d110e502/volumes" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.164038 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 19:11:20 crc kubenswrapper[4731]: E1203 19:11:20.164485 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" containerName="init" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.164515 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" containerName="init" Dec 03 19:11:20 crc kubenswrapper[4731]: E1203 19:11:20.164537 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" containerName="dnsmasq-dns" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.164547 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" containerName="dnsmasq-dns" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.164735 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3bbd9e-0b1e-4199-b050-07d7d110e502" containerName="dnsmasq-dns" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.169765 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.174439 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.174848 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.175276 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.175519 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gh5lq" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.188553 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.332403 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.332724 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.332830 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9zvl\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-kube-api-access-k9zvl\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" 
Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.332938 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-lock\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.333100 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-cache\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.434863 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-lock\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.434940 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-cache\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.434986 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.435015 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") 
pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.435049 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9zvl\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-kube-api-access-k9zvl\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: E1203 19:11:20.435568 4731 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 19:11:20 crc kubenswrapper[4731]: E1203 19:11:20.435680 4731 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 19:11:20 crc kubenswrapper[4731]: E1203 19:11:20.435809 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift podName:8327126f-a2f3-4b2d-a5b3-118bfa1f41ce nodeName:}" failed. No retries permitted until 2025-12-03 19:11:20.935781436 +0000 UTC m=+1001.534375900 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift") pod "swift-storage-0" (UID: "8327126f-a2f3-4b2d-a5b3-118bfa1f41ce") : configmap "swift-ring-files" not found Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.435717 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.435721 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-lock\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.435688 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-cache\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.447754 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6vv75"] Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.449054 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.450755 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.450946 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.456738 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.457445 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9zvl\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-kube-api-access-k9zvl\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.512133 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.523791 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6vv75"] Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.641182 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-swiftconf\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.641519 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-scripts\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.641670 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-combined-ca-bundle\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.641783 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-ring-data-devices\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.641873 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-dispersionconf\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.641950 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-etc-swift\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.642025 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c2tjg\" (UniqueName: \"kubernetes.io/projected/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-kube-api-access-c2tjg\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.758903 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tjg\" (UniqueName: \"kubernetes.io/projected/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-kube-api-access-c2tjg\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.759029 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-swiftconf\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.759076 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-scripts\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.759170 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-combined-ca-bundle\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.759209 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-ring-data-devices\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.759238 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-dispersionconf\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.759286 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-etc-swift\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.759867 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-etc-swift\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.760835 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-ring-data-devices\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.772926 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-dispersionconf\") pod \"swift-ring-rebalance-6vv75\" 
(UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.774733 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-scripts\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.818484 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-swiftconf\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.825485 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2tjg\" (UniqueName: \"kubernetes.io/projected/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-kube-api-access-c2tjg\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:20 crc kubenswrapper[4731]: I1203 19:11:20.871196 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-combined-ca-bundle\") pod \"swift-ring-rebalance-6vv75\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:21 crc kubenswrapper[4731]: I1203 19:11:20.983857 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:21 crc kubenswrapper[4731]: E1203 
19:11:20.984010 4731 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 19:11:21 crc kubenswrapper[4731]: E1203 19:11:20.984046 4731 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 19:11:21 crc kubenswrapper[4731]: E1203 19:11:20.984138 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift podName:8327126f-a2f3-4b2d-a5b3-118bfa1f41ce nodeName:}" failed. No retries permitted until 2025-12-03 19:11:21.984096868 +0000 UTC m=+1002.582691332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift") pod "swift-storage-0" (UID: "8327126f-a2f3-4b2d-a5b3-118bfa1f41ce") : configmap "swift-ring-files" not found Dec 03 19:11:21 crc kubenswrapper[4731]: I1203 19:11:21.189553 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:22 crc kubenswrapper[4731]: I1203 19:11:22.023783 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:22 crc kubenswrapper[4731]: E1203 19:11:22.024085 4731 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 19:11:22 crc kubenswrapper[4731]: E1203 19:11:22.025927 4731 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 19:11:22 crc kubenswrapper[4731]: E1203 19:11:22.026077 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift podName:8327126f-a2f3-4b2d-a5b3-118bfa1f41ce nodeName:}" failed. No retries permitted until 2025-12-03 19:11:24.026039634 +0000 UTC m=+1004.624634098 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift") pod "swift-storage-0" (UID: "8327126f-a2f3-4b2d-a5b3-118bfa1f41ce") : configmap "swift-ring-files" not found Dec 03 19:11:22 crc kubenswrapper[4731]: I1203 19:11:22.336974 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6vv75"] Dec 03 19:11:22 crc kubenswrapper[4731]: I1203 19:11:22.861318 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6vv75" event={"ID":"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285","Type":"ContainerStarted","Data":"5154f743ed0adf554ede0f003d2e681fc5bcf52ffde84c720b8ffc22db71d16a"} Dec 03 19:11:24 crc kubenswrapper[4731]: I1203 19:11:24.118198 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:24 crc kubenswrapper[4731]: E1203 19:11:24.118430 4731 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 19:11:24 crc kubenswrapper[4731]: E1203 19:11:24.118459 4731 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 19:11:24 crc kubenswrapper[4731]: E1203 19:11:24.118520 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift podName:8327126f-a2f3-4b2d-a5b3-118bfa1f41ce nodeName:}" failed. No retries permitted until 2025-12-03 19:11:28.1185021 +0000 UTC m=+1008.717096564 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift") pod "swift-storage-0" (UID: "8327126f-a2f3-4b2d-a5b3-118bfa1f41ce") : configmap "swift-ring-files" not found Dec 03 19:11:25 crc kubenswrapper[4731]: I1203 19:11:25.636974 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 19:11:25 crc kubenswrapper[4731]: I1203 19:11:25.638384 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 19:11:25 crc kubenswrapper[4731]: I1203 19:11:25.891054 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" event={"ID":"e2f89089-2dc7-4130-8153-6e5516c727ee","Type":"ContainerStarted","Data":"93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356"} Dec 03 19:11:25 crc kubenswrapper[4731]: I1203 19:11:25.893140 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" event={"ID":"4d11f484-2fc0-41d0-bac8-24eb2d301146","Type":"ContainerStarted","Data":"2b92f0683b098abbb47b060363eed7d671e07ab138a12ce302a1c66801c7f1ef"} Dec 03 19:11:26 crc kubenswrapper[4731]: I1203 19:11:26.901679 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:26 crc kubenswrapper[4731]: I1203 19:11:26.923512 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" podStartSLOduration=9.923467209 podStartE2EDuration="9.923467209s" podCreationTimestamp="2025-12-03 19:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:11:26.919219797 +0000 UTC m=+1007.517814261" watchObservedRunningTime="2025-12-03 19:11:26.923467209 +0000 UTC m=+1007.522061683" Dec 03 19:11:26 crc 
kubenswrapper[4731]: I1203 19:11:26.938366 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" podStartSLOduration=9.938347017 podStartE2EDuration="9.938347017s" podCreationTimestamp="2025-12-03 19:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:11:26.93422032 +0000 UTC m=+1007.532814804" watchObservedRunningTime="2025-12-03 19:11:26.938347017 +0000 UTC m=+1007.536941481" Dec 03 19:11:27 crc kubenswrapper[4731]: I1203 19:11:27.428951 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 19:11:27 crc kubenswrapper[4731]: I1203 19:11:27.541140 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 19:11:27 crc kubenswrapper[4731]: I1203 19:11:27.668599 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:27 crc kubenswrapper[4731]: I1203 19:11:27.921847 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s8zdm" event={"ID":"d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3","Type":"ContainerStarted","Data":"8528966dd997d2152fb9484321e00b314032a93fdb928fdd79041027916ee3df"} Dec 03 19:11:27 crc kubenswrapper[4731]: I1203 19:11:27.922419 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s8zdm" Dec 03 19:11:27 crc kubenswrapper[4731]: I1203 19:11:27.924063 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a549a4d-e989-42bf-8e74-556e6feb9507","Type":"ContainerStarted","Data":"9f7592975c03bfb2b554ae6cd8fe2dcd239a395e0d58e02e9a8a319b14e61f0b"} Dec 03 19:11:27 crc kubenswrapper[4731]: I1203 19:11:27.967923 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-s8zdm" podStartSLOduration=15.717897908 podStartE2EDuration="45.967903491s" podCreationTimestamp="2025-12-03 19:10:42 +0000 UTC" firstStartedPulling="2025-12-03 19:10:56.513748254 +0000 UTC m=+977.112342708" lastFinishedPulling="2025-12-03 19:11:26.763753827 +0000 UTC m=+1007.362348291" observedRunningTime="2025-12-03 19:11:27.955051894 +0000 UTC m=+1008.553646378" watchObservedRunningTime="2025-12-03 19:11:27.967903491 +0000 UTC m=+1008.566497945" Dec 03 19:11:27 crc kubenswrapper[4731]: I1203 19:11:27.974925 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.432618759 podStartE2EDuration="49.974905297s" podCreationTimestamp="2025-12-03 19:10:38 +0000 UTC" firstStartedPulling="2025-12-03 19:10:56.50389801 +0000 UTC m=+977.102492474" lastFinishedPulling="2025-12-03 19:11:27.046184548 +0000 UTC m=+1007.644779012" observedRunningTime="2025-12-03 19:11:27.969036356 +0000 UTC m=+1008.567630830" watchObservedRunningTime="2025-12-03 19:11:27.974905297 +0000 UTC m=+1008.573499761" Dec 03 19:11:28 crc kubenswrapper[4731]: I1203 19:11:28.147114 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:28 crc kubenswrapper[4731]: E1203 19:11:28.147327 4731 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 19:11:28 crc kubenswrapper[4731]: E1203 19:11:28.147358 4731 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 19:11:28 crc kubenswrapper[4731]: E1203 19:11:28.147416 4731 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift podName:8327126f-a2f3-4b2d-a5b3-118bfa1f41ce nodeName:}" failed. No retries permitted until 2025-12-03 19:11:36.147394714 +0000 UTC m=+1016.745989178 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift") pod "swift-storage-0" (UID: "8327126f-a2f3-4b2d-a5b3-118bfa1f41ce") : configmap "swift-ring-files" not found Dec 03 19:11:28 crc kubenswrapper[4731]: I1203 19:11:28.942436 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d88aab08-1249-4391-b0e2-ec8f0704e7c3","Type":"ContainerStarted","Data":"8d9c9b140b920c96518f80fa19747a2cb626fac96b0f0dc888dfffcd19ae906b"} Dec 03 19:11:28 crc kubenswrapper[4731]: I1203 19:11:28.976407 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.122181203 podStartE2EDuration="46.976380303s" podCreationTimestamp="2025-12-03 19:10:42 +0000 UTC" firstStartedPulling="2025-12-03 19:10:56.698581442 +0000 UTC m=+977.297175906" lastFinishedPulling="2025-12-03 19:11:27.552780542 +0000 UTC m=+1008.151375006" observedRunningTime="2025-12-03 19:11:28.966650483 +0000 UTC m=+1009.565244947" watchObservedRunningTime="2025-12-03 19:11:28.976380303 +0000 UTC m=+1009.574974767" Dec 03 19:11:28 crc kubenswrapper[4731]: I1203 19:11:28.990215 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 19:11:29 crc kubenswrapper[4731]: I1203 19:11:29.331103 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 19:11:29 crc kubenswrapper[4731]: I1203 19:11:29.331163 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 19:11:31 crc kubenswrapper[4731]: I1203 19:11:31.018800 
4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 19:11:31 crc kubenswrapper[4731]: I1203 19:11:31.104189 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 19:11:31 crc kubenswrapper[4731]: I1203 19:11:31.984367 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6vv75" event={"ID":"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285","Type":"ContainerStarted","Data":"b89ec29dd46411636e007d833373bc5812e5e464d4c6ce4431ade43776e2a602"} Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.008366 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-6vv75" podStartSLOduration=2.987869215 podStartE2EDuration="12.008343791s" podCreationTimestamp="2025-12-03 19:11:20 +0000 UTC" firstStartedPulling="2025-12-03 19:11:22.357326554 +0000 UTC m=+1002.955921018" lastFinishedPulling="2025-12-03 19:11:31.37780112 +0000 UTC m=+1011.976395594" observedRunningTime="2025-12-03 19:11:32.004276566 +0000 UTC m=+1012.602871040" watchObservedRunningTime="2025-12-03 19:11:32.008343791 +0000 UTC m=+1012.606938255" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.358590 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.393559 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.477340 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.649574 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.654103 4731 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.657498 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.658035 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.658400 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.658649 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-x2xb6" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.666123 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.670099 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.748340 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71768265-a5dd-4890-b3ff-349f6a1114fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.748384 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.748407 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/71768265-a5dd-4890-b3ff-349f6a1114fc-scripts\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.748426 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwjt5\" (UniqueName: \"kubernetes.io/projected/71768265-a5dd-4890-b3ff-349f6a1114fc-kube-api-access-zwjt5\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.748458 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.748513 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71768265-a5dd-4890-b3ff-349f6a1114fc-config\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.748574 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.790180 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f755d8cd7-x6nnz"] Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.850495 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71768265-a5dd-4890-b3ff-349f6a1114fc-config\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.850852 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.850981 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71768265-a5dd-4890-b3ff-349f6a1114fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.851065 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.851145 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71768265-a5dd-4890-b3ff-349f6a1114fc-scripts\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.851217 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwjt5\" (UniqueName: \"kubernetes.io/projected/71768265-a5dd-4890-b3ff-349f6a1114fc-kube-api-access-zwjt5\") pod \"ovn-northd-0\" (UID: 
\"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.851354 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.852007 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71768265-a5dd-4890-b3ff-349f6a1114fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.852544 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71768265-a5dd-4890-b3ff-349f6a1114fc-config\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.852753 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71768265-a5dd-4890-b3ff-349f6a1114fc-scripts\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.857773 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.871922 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwjt5\" (UniqueName: 
\"kubernetes.io/projected/71768265-a5dd-4890-b3ff-349f6a1114fc-kube-api-access-zwjt5\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.872561 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.890248 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71768265-a5dd-4890-b3ff-349f6a1114fc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"71768265-a5dd-4890-b3ff-349f6a1114fc\") " pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.978409 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 19:11:32 crc kubenswrapper[4731]: I1203 19:11:32.990096 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" podUID="e2f89089-2dc7-4130-8153-6e5516c727ee" containerName="dnsmasq-dns" containerID="cri-o://93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356" gracePeriod=10 Dec 03 19:11:33 crc kubenswrapper[4731]: W1203 19:11:33.674550 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71768265_a5dd_4890_b3ff_349f6a1114fc.slice/crio-b65be9e6407ab7e83f0632d4678855bc2c3d1cab312cf53e9a6869032f47215e WatchSource:0}: Error finding container b65be9e6407ab7e83f0632d4678855bc2c3d1cab312cf53e9a6869032f47215e: Status 404 returned error can't find the container with id b65be9e6407ab7e83f0632d4678855bc2c3d1cab312cf53e9a6869032f47215e Dec 03 19:11:33 crc kubenswrapper[4731]: I1203 19:11:33.684028 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 19:11:33 crc kubenswrapper[4731]: I1203 19:11:33.940427 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.006113 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"71768265-a5dd-4890-b3ff-349f6a1114fc","Type":"ContainerStarted","Data":"b65be9e6407ab7e83f0632d4678855bc2c3d1cab312cf53e9a6869032f47215e"} Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.010115 4731 generic.go:334] "Generic (PLEG): container finished" podID="e2f89089-2dc7-4130-8153-6e5516c727ee" containerID="93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356" exitCode=0 Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.010215 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.010215 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" event={"ID":"e2f89089-2dc7-4130-8153-6e5516c727ee","Type":"ContainerDied","Data":"93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356"} Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.010507 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f755d8cd7-x6nnz" event={"ID":"e2f89089-2dc7-4130-8153-6e5516c727ee","Type":"ContainerDied","Data":"e07d9b02560aa040f241306af2a72a0493b1155f2807140d4f8254fb2257a6f7"} Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.010541 4731 scope.go:117] "RemoveContainer" containerID="93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.035166 4731 scope.go:117] "RemoveContainer" containerID="cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.055809 4731 scope.go:117] "RemoveContainer" containerID="93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356" Dec 03 19:11:34 crc kubenswrapper[4731]: E1203 19:11:34.056269 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356\": container with ID starting with 93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356 not found: ID does not exist" containerID="93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.056328 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356"} err="failed to get container status 
\"93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356\": rpc error: code = NotFound desc = could not find container \"93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356\": container with ID starting with 93f3c4e774d69d792d42ec5ecfc8a5fbb37ac4d50cdf537a480cd1a2c7883356 not found: ID does not exist" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.056369 4731 scope.go:117] "RemoveContainer" containerID="cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57" Dec 03 19:11:34 crc kubenswrapper[4731]: E1203 19:11:34.056695 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57\": container with ID starting with cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57 not found: ID does not exist" containerID="cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.056737 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57"} err="failed to get container status \"cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57\": rpc error: code = NotFound desc = could not find container \"cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57\": container with ID starting with cf87d9b78b4ddde67aa00d32539040dfbe95f9390360155beedd37efa26f0e57 not found: ID does not exist" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.079784 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-config\") pod \"e2f89089-2dc7-4130-8153-6e5516c727ee\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.079867 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wr29p\" (UniqueName: \"kubernetes.io/projected/e2f89089-2dc7-4130-8153-6e5516c727ee-kube-api-access-wr29p\") pod \"e2f89089-2dc7-4130-8153-6e5516c727ee\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.079911 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-ovsdbserver-nb\") pod \"e2f89089-2dc7-4130-8153-6e5516c727ee\" (UID: \"e2f89089-2dc7-4130-8153-6e5516c727ee\") " Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.087134 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f89089-2dc7-4130-8153-6e5516c727ee-kube-api-access-wr29p" (OuterVolumeSpecName: "kube-api-access-wr29p") pod "e2f89089-2dc7-4130-8153-6e5516c727ee" (UID: "e2f89089-2dc7-4130-8153-6e5516c727ee"). InnerVolumeSpecName "kube-api-access-wr29p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.123439 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2f89089-2dc7-4130-8153-6e5516c727ee" (UID: "e2f89089-2dc7-4130-8153-6e5516c727ee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.125341 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-config" (OuterVolumeSpecName: "config") pod "e2f89089-2dc7-4130-8153-6e5516c727ee" (UID: "e2f89089-2dc7-4130-8153-6e5516c727ee"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.182028 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.182427 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr29p\" (UniqueName: \"kubernetes.io/projected/e2f89089-2dc7-4130-8153-6e5516c727ee-kube-api-access-wr29p\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.182441 4731 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f89089-2dc7-4130-8153-6e5516c727ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.355438 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f755d8cd7-x6nnz"] Dec 03 19:11:34 crc kubenswrapper[4731]: I1203 19:11:34.362161 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f755d8cd7-x6nnz"] Dec 03 19:11:35 crc kubenswrapper[4731]: I1203 19:11:35.228031 4731 generic.go:334] "Generic (PLEG): container finished" podID="469480bc-e167-4ecc-87c4-9691057d999f" containerID="58682561d8a5f5c7c84ae4f7bf28b58db8754a15808e79e7e8013b7ec94685b2" exitCode=0 Dec 03 19:11:35 crc kubenswrapper[4731]: I1203 19:11:35.228397 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469480bc-e167-4ecc-87c4-9691057d999f","Type":"ContainerDied","Data":"58682561d8a5f5c7c84ae4f7bf28b58db8754a15808e79e7e8013b7ec94685b2"} Dec 03 19:11:35 crc kubenswrapper[4731]: I1203 19:11:35.867702 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f89089-2dc7-4130-8153-6e5516c727ee" path="/var/lib/kubelet/pods/e2f89089-2dc7-4130-8153-6e5516c727ee/volumes" 
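The repeated `MountVolume.SetUp failed ... configmap "swift-ring-files" not found` entries above show the kubelet's retry delay doubling each attempt: `durationBeforeRetry` goes 1s, 2s, 4s, 8s (and 16s further down). A minimal Python sketch of that doubling backoff; the function name, starting delay, and cap are illustrative only, not kubelet's actual implementation:

```python
# Hypothetical sketch of the doubling "durationBeforeRetry" pattern in
# the nestedpendingoperations log entries above. Names and the cap are
# illustrative assumptions, not taken from kubelet source.
from typing import Optional


def next_retry_delay(previous_delay: Optional[float],
                     initial: float = 1.0,
                     max_delay: float = 120.0) -> float:
    """Start at `initial` seconds, double after each failure, clamp at `max_delay`."""
    if previous_delay is None:
        return initial
    return min(previous_delay * 2.0, max_delay)


# Reproduce the intervals logged for the failing etc-swift mount:
delays = []
d = None
for _ in range(5):
    d = next_retry_delay(d)
    delays.append(d)
# delays is now [1.0, 2.0, 4.0, 8.0, 16.0], matching the logged retries
```

The cap matters in practice: without one, a persistently missing ConfigMap (as here, until `swift-ring-rebalance` publishes the ring files) would push the retry interval out indefinitely.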
Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.222876 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0" Dec 03 19:11:36 crc kubenswrapper[4731]: E1203 19:11:36.223077 4731 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 19:11:36 crc kubenswrapper[4731]: E1203 19:11:36.223291 4731 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 19:11:36 crc kubenswrapper[4731]: E1203 19:11:36.223343 4731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift podName:8327126f-a2f3-4b2d-a5b3-118bfa1f41ce nodeName:}" failed. No retries permitted until 2025-12-03 19:11:52.223326472 +0000 UTC m=+1032.821920936 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift") pod "swift-storage-0" (UID: "8327126f-a2f3-4b2d-a5b3-118bfa1f41ce") : configmap "swift-ring-files" not found Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.241774 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469480bc-e167-4ecc-87c4-9691057d999f","Type":"ContainerStarted","Data":"98b24cea655e157dd1c4c37f068e846e36262777c12548713701f7a9b89f8f01"} Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.242003 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.243496 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"71768265-a5dd-4890-b3ff-349f6a1114fc","Type":"ContainerStarted","Data":"0ab7d6d768d8e0269b829e7c3e0e3b973eaf93259aa60fbde6d1b0503137caf3"} Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.243520 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"71768265-a5dd-4890-b3ff-349f6a1114fc","Type":"ContainerStarted","Data":"cdb30a5fd7e8c28e4c47a4977c6edf0a938a9d06d3741edd3cd9429f84f18e65"} Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.243694 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.302042 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=61.949659541 podStartE2EDuration="1m3.302023762s" podCreationTimestamp="2025-12-03 19:10:33 +0000 UTC" firstStartedPulling="2025-12-03 19:10:56.526804267 +0000 UTC m=+977.125398731" lastFinishedPulling="2025-12-03 19:10:57.879168488 +0000 UTC m=+978.477762952" observedRunningTime="2025-12-03 19:11:36.296388188 
+0000 UTC m=+1016.894982652" watchObservedRunningTime="2025-12-03 19:11:36.302023762 +0000 UTC m=+1016.900618226" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.331701 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.679352304 podStartE2EDuration="4.331681998s" podCreationTimestamp="2025-12-03 19:11:32 +0000 UTC" firstStartedPulling="2025-12-03 19:11:33.677761144 +0000 UTC m=+1014.276355618" lastFinishedPulling="2025-12-03 19:11:35.330090848 +0000 UTC m=+1015.928685312" observedRunningTime="2025-12-03 19:11:36.329563863 +0000 UTC m=+1016.928158347" watchObservedRunningTime="2025-12-03 19:11:36.331681998 +0000 UTC m=+1016.930276462" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.484207 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2324-account-create-update-ll8nj"] Dec 03 19:11:36 crc kubenswrapper[4731]: E1203 19:11:36.484584 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f89089-2dc7-4130-8153-6e5516c727ee" containerName="init" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.484599 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f89089-2dc7-4130-8153-6e5516c727ee" containerName="init" Dec 03 19:11:36 crc kubenswrapper[4731]: E1203 19:11:36.484628 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f89089-2dc7-4130-8153-6e5516c727ee" containerName="dnsmasq-dns" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.484634 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f89089-2dc7-4130-8153-6e5516c727ee" containerName="dnsmasq-dns" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.484791 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f89089-2dc7-4130-8153-6e5516c727ee" containerName="dnsmasq-dns" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.485391 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.487445 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.505576 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2324-account-create-update-ll8nj"] Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.544799 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pbxzw"] Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.546068 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.552138 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pbxzw"] Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.630351 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97cpc\" (UniqueName: \"kubernetes.io/projected/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-kube-api-access-97cpc\") pod \"keystone-2324-account-create-update-ll8nj\" (UID: \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\") " pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.630457 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-operator-scripts\") pod \"keystone-2324-account-create-update-ll8nj\" (UID: \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\") " pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.732017 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-operator-scripts\") pod \"keystone-2324-account-create-update-ll8nj\" (UID: \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\") " pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.732106 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfpff\" (UniqueName: \"kubernetes.io/projected/68165469-3496-4274-9d9b-f397ed17e5c9-kube-api-access-zfpff\") pod \"keystone-db-create-pbxzw\" (UID: \"68165469-3496-4274-9d9b-f397ed17e5c9\") " pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.732311 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97cpc\" (UniqueName: \"kubernetes.io/projected/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-kube-api-access-97cpc\") pod \"keystone-2324-account-create-update-ll8nj\" (UID: \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\") " pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.732345 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68165469-3496-4274-9d9b-f397ed17e5c9-operator-scripts\") pod \"keystone-db-create-pbxzw\" (UID: \"68165469-3496-4274-9d9b-f397ed17e5c9\") " pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.733385 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-operator-scripts\") pod \"keystone-2324-account-create-update-ll8nj\" (UID: \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\") " pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.834577 4731 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/placement-db-create-tmrzz"] Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.837127 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.839837 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68165469-3496-4274-9d9b-f397ed17e5c9-operator-scripts\") pod \"keystone-db-create-pbxzw\" (UID: \"68165469-3496-4274-9d9b-f397ed17e5c9\") " pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.839933 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfpff\" (UniqueName: \"kubernetes.io/projected/68165469-3496-4274-9d9b-f397ed17e5c9-kube-api-access-zfpff\") pod \"keystone-db-create-pbxzw\" (UID: \"68165469-3496-4274-9d9b-f397ed17e5c9\") " pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.841666 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68165469-3496-4274-9d9b-f397ed17e5c9-operator-scripts\") pod \"keystone-db-create-pbxzw\" (UID: \"68165469-3496-4274-9d9b-f397ed17e5c9\") " pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.850796 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tmrzz"] Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.873241 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97cpc\" (UniqueName: \"kubernetes.io/projected/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-kube-api-access-97cpc\") pod \"keystone-2324-account-create-update-ll8nj\" (UID: \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\") " pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:36 crc 
kubenswrapper[4731]: I1203 19:11:36.969779 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl62p\" (UniqueName: \"kubernetes.io/projected/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-kube-api-access-wl62p\") pod \"placement-db-create-tmrzz\" (UID: \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\") " pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.969934 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-operator-scripts\") pod \"placement-db-create-tmrzz\" (UID: \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\") " pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.978404 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d6b0-account-create-update-vkkvz"] Dec 03 19:11:36 crc kubenswrapper[4731]: I1203 19:11:36.984881 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:36.999050 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfpff\" (UniqueName: \"kubernetes.io/projected/68165469-3496-4274-9d9b-f397ed17e5c9-kube-api-access-zfpff\") pod \"keystone-db-create-pbxzw\" (UID: \"68165469-3496-4274-9d9b-f397ed17e5c9\") " pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:36.999140 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.015575 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d6b0-account-create-update-vkkvz"] Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.072458 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-operator-scripts\") pod \"placement-db-create-tmrzz\" (UID: \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\") " pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.072575 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl62p\" (UniqueName: \"kubernetes.io/projected/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-kube-api-access-wl62p\") pod \"placement-db-create-tmrzz\" (UID: \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\") " pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.073825 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-operator-scripts\") pod \"placement-db-create-tmrzz\" (UID: \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\") " pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:37 crc 
kubenswrapper[4731]: I1203 19:11:37.081879 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-xbdsh"] Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.083623 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.090563 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xbdsh"] Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.097381 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6f66-account-create-update-8242w"] Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.098733 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.104897 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.108983 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl62p\" (UniqueName: \"kubernetes.io/projected/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-kube-api-access-wl62p\") pod \"placement-db-create-tmrzz\" (UID: \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\") " pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.109379 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.117668 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6f66-account-create-update-8242w"] Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.173891 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.174584 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdt44\" (UniqueName: \"kubernetes.io/projected/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-kube-api-access-cdt44\") pod \"placement-d6b0-account-create-update-vkkvz\" (UID: \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\") " pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.174819 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-operator-scripts\") pod \"placement-d6b0-account-create-update-vkkvz\" (UID: \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\") " pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.276675 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-operator-scripts\") pod \"glance-db-create-xbdsh\" (UID: \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\") " pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.276742 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdt44\" (UniqueName: \"kubernetes.io/projected/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-kube-api-access-cdt44\") pod \"placement-d6b0-account-create-update-vkkvz\" (UID: \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\") " pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.277164 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2497e019-571b-42b7-bec0-f1ea5d26c3e9-operator-scripts\") pod \"glance-6f66-account-create-update-8242w\" (UID: \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\") " pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.277218 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kxgx\" (UniqueName: \"kubernetes.io/projected/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-kube-api-access-4kxgx\") pod \"glance-db-create-xbdsh\" (UID: \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\") " pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.277273 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs28n\" (UniqueName: \"kubernetes.io/projected/2497e019-571b-42b7-bec0-f1ea5d26c3e9-kube-api-access-bs28n\") pod \"glance-6f66-account-create-update-8242w\" (UID: \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\") " pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.277780 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-operator-scripts\") pod \"placement-d6b0-account-create-update-vkkvz\" (UID: \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\") " pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.278518 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-operator-scripts\") pod \"placement-d6b0-account-create-update-vkkvz\" (UID: \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\") " pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.290711 4731 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.294616 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdt44\" (UniqueName: \"kubernetes.io/projected/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-kube-api-access-cdt44\") pod \"placement-d6b0-account-create-update-vkkvz\" (UID: \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\") " pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.338622 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.590321 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-operator-scripts\") pod \"glance-db-create-xbdsh\" (UID: \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\") " pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.590675 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2497e019-571b-42b7-bec0-f1ea5d26c3e9-operator-scripts\") pod \"glance-6f66-account-create-update-8242w\" (UID: \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\") " pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.590741 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kxgx\" (UniqueName: \"kubernetes.io/projected/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-kube-api-access-4kxgx\") pod \"glance-db-create-xbdsh\" (UID: \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\") " pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.590791 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs28n\" (UniqueName: \"kubernetes.io/projected/2497e019-571b-42b7-bec0-f1ea5d26c3e9-kube-api-access-bs28n\") pod \"glance-6f66-account-create-update-8242w\" (UID: \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\") " pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.593946 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-operator-scripts\") pod \"glance-db-create-xbdsh\" (UID: \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\") " pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.594770 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2497e019-571b-42b7-bec0-f1ea5d26c3e9-operator-scripts\") pod \"glance-6f66-account-create-update-8242w\" (UID: \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\") " pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.618507 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kxgx\" (UniqueName: \"kubernetes.io/projected/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-kube-api-access-4kxgx\") pod \"glance-db-create-xbdsh\" (UID: \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\") " pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:37 crc kubenswrapper[4731]: I1203 19:11:37.695301 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs28n\" (UniqueName: \"kubernetes.io/projected/2497e019-571b-42b7-bec0-f1ea5d26c3e9-kube-api-access-bs28n\") pod \"glance-6f66-account-create-update-8242w\" (UID: \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\") " pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:37.922498 
4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:38.085169 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:38.440847 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d6b0-account-create-update-vkkvz"] Dec 03 19:11:38 crc kubenswrapper[4731]: W1203 19:11:38.453443 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode946b6b9_9dfd_43cc_ae7d_b9e6abcecd0f.slice/crio-7e8a32eaeb74c440561fa361c0abb2786c38d54ee15b4f415610612c048cd546 WatchSource:0}: Error finding container 7e8a32eaeb74c440561fa361c0abb2786c38d54ee15b4f415610612c048cd546: Status 404 returned error can't find the container with id 7e8a32eaeb74c440561fa361c0abb2786c38d54ee15b4f415610612c048cd546 Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:38.732909 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pbxzw"] Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:38.927273 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tmrzz"] Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:38.949478 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2324-account-create-update-ll8nj"] Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:38.973471 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6f66-account-create-update-8242w"] Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:38.986425 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xbdsh"] Dec 03 19:11:38 crc kubenswrapper[4731]: I1203 19:11:38.992663 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/kube-state-metrics-0" Dec 03 19:11:39 crc kubenswrapper[4731]: W1203 19:11:39.010654 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc75c482c_bd09_47ec_b8e9_0f114cd19ccd.slice/crio-db4205b7ebaa0241846bcf7978227de34a00913a92c426d438a4e2b93d94c3a4 WatchSource:0}: Error finding container db4205b7ebaa0241846bcf7978227de34a00913a92c426d438a4e2b93d94c3a4: Status 404 returned error can't find the container with id db4205b7ebaa0241846bcf7978227de34a00913a92c426d438a4e2b93d94c3a4 Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.328643 4731 generic.go:334] "Generic (PLEG): container finished" podID="68165469-3496-4274-9d9b-f397ed17e5c9" containerID="24b65bd73fe70e2744072a56a4da18ef1c85cc63e3069337c79bfe7b53813d26" exitCode=0 Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.328881 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pbxzw" event={"ID":"68165469-3496-4274-9d9b-f397ed17e5c9","Type":"ContainerDied","Data":"24b65bd73fe70e2744072a56a4da18ef1c85cc63e3069337c79bfe7b53813d26"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.329145 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pbxzw" event={"ID":"68165469-3496-4274-9d9b-f397ed17e5c9","Type":"ContainerStarted","Data":"9ad2f7e2b759763b4a7efb80e8dd5de1deb10cbc267b2e4400ef13921ab0c3f3"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.331027 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2324-account-create-update-ll8nj" event={"ID":"4fe6a778-abea-4237-bdcb-2a9c5a8c2967","Type":"ContainerStarted","Data":"c762a6d3c3be6c5441edf98e4a0009a8c7f118e6ca51f92f51dbc9a5227464eb"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.331092 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2324-account-create-update-ll8nj" 
event={"ID":"4fe6a778-abea-4237-bdcb-2a9c5a8c2967","Type":"ContainerStarted","Data":"8834349aa5b9c30905afa7d40283b49e5a61238f6fb42f469fbdb020ecabe540"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.333611 4731 generic.go:334] "Generic (PLEG): container finished" podID="e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f" containerID="31d2c33ff6fb28b9d0b317081d70932a9826f93ad1c9bde1a42583fb371dc4c4" exitCode=0 Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.333708 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6b0-account-create-update-vkkvz" event={"ID":"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f","Type":"ContainerDied","Data":"31d2c33ff6fb28b9d0b317081d70932a9826f93ad1c9bde1a42583fb371dc4c4"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.333748 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6b0-account-create-update-vkkvz" event={"ID":"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f","Type":"ContainerStarted","Data":"7e8a32eaeb74c440561fa361c0abb2786c38d54ee15b4f415610612c048cd546"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.338586 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tmrzz" event={"ID":"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef","Type":"ContainerStarted","Data":"2923cad73d61f1b745f3426e39b7df22c4384c9481fb1b24b46ccbdd136bbef5"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.338642 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tmrzz" event={"ID":"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef","Type":"ContainerStarted","Data":"148a11ae92895d1fc5d0921bced3de67286e9ee44f34bc37f7af870c9b3aadc6"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.344283 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6f66-account-create-update-8242w" 
event={"ID":"2497e019-571b-42b7-bec0-f1ea5d26c3e9","Type":"ContainerStarted","Data":"1f4eb078260b9ab9b528398b168e5a472464724dc91a27d1d038fec15864c4c2"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.344349 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6f66-account-create-update-8242w" event={"ID":"2497e019-571b-42b7-bec0-f1ea5d26c3e9","Type":"ContainerStarted","Data":"25dfe6d99917d299c99ae0bd972599cbf67d6960def7aab327539d5a77145a08"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.349125 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xbdsh" event={"ID":"c75c482c-bd09-47ec-b8e9-0f114cd19ccd","Type":"ContainerStarted","Data":"3e0605093657eb676597f03fae1fad86fadcbcb2d1b7e42904647a25c6f5779e"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.349161 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xbdsh" event={"ID":"c75c482c-bd09-47ec-b8e9-0f114cd19ccd","Type":"ContainerStarted","Data":"db4205b7ebaa0241846bcf7978227de34a00913a92c426d438a4e2b93d94c3a4"} Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.384738 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-xbdsh" podStartSLOduration=2.384713267 podStartE2EDuration="2.384713267s" podCreationTimestamp="2025-12-03 19:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:11:39.377177594 +0000 UTC m=+1019.975772078" watchObservedRunningTime="2025-12-03 19:11:39.384713267 +0000 UTC m=+1019.983307731" Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.427391 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-2324-account-create-update-ll8nj" podStartSLOduration=3.427367734 podStartE2EDuration="3.427367734s" podCreationTimestamp="2025-12-03 19:11:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:11:39.425653431 +0000 UTC m=+1020.024247895" watchObservedRunningTime="2025-12-03 19:11:39.427367734 +0000 UTC m=+1020.025962208" Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.476228 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-tmrzz" podStartSLOduration=3.476188552 podStartE2EDuration="3.476188552s" podCreationTimestamp="2025-12-03 19:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:11:39.454520513 +0000 UTC m=+1020.053114987" watchObservedRunningTime="2025-12-03 19:11:39.476188552 +0000 UTC m=+1020.074783016" Dec 03 19:11:39 crc kubenswrapper[4731]: I1203 19:11:39.486943 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-6f66-account-create-update-8242w" podStartSLOduration=2.484438506 podStartE2EDuration="2.484438506s" podCreationTimestamp="2025-12-03 19:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:11:39.472647052 +0000 UTC m=+1020.071241516" watchObservedRunningTime="2025-12-03 19:11:39.484438506 +0000 UTC m=+1020.083032970" Dec 03 19:11:40 crc kubenswrapper[4731]: I1203 19:11:40.359587 4731 generic.go:334] "Generic (PLEG): container finished" podID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" containerID="fc32c944255cdac00d14416c11b5771f9a9f1e781738b3debf50039e4d9a17fd" exitCode=0 Dec 03 19:11:40 crc kubenswrapper[4731]: I1203 19:11:40.359676 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b","Type":"ContainerDied","Data":"fc32c944255cdac00d14416c11b5771f9a9f1e781738b3debf50039e4d9a17fd"} Dec 03 19:11:40 crc 
kubenswrapper[4731]: I1203 19:11:40.365471 4731 generic.go:334] "Generic (PLEG): container finished" podID="4fe6a778-abea-4237-bdcb-2a9c5a8c2967" containerID="c762a6d3c3be6c5441edf98e4a0009a8c7f118e6ca51f92f51dbc9a5227464eb" exitCode=0 Dec 03 19:11:40 crc kubenswrapper[4731]: I1203 19:11:40.365637 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2324-account-create-update-ll8nj" event={"ID":"4fe6a778-abea-4237-bdcb-2a9c5a8c2967","Type":"ContainerDied","Data":"c762a6d3c3be6c5441edf98e4a0009a8c7f118e6ca51f92f51dbc9a5227464eb"} Dec 03 19:11:40 crc kubenswrapper[4731]: I1203 19:11:40.369653 4731 generic.go:334] "Generic (PLEG): container finished" podID="949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef" containerID="2923cad73d61f1b745f3426e39b7df22c4384c9481fb1b24b46ccbdd136bbef5" exitCode=0 Dec 03 19:11:40 crc kubenswrapper[4731]: I1203 19:11:40.369796 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tmrzz" event={"ID":"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef","Type":"ContainerDied","Data":"2923cad73d61f1b745f3426e39b7df22c4384c9481fb1b24b46ccbdd136bbef5"} Dec 03 19:11:40 crc kubenswrapper[4731]: I1203 19:11:40.372479 4731 generic.go:334] "Generic (PLEG): container finished" podID="2497e019-571b-42b7-bec0-f1ea5d26c3e9" containerID="1f4eb078260b9ab9b528398b168e5a472464724dc91a27d1d038fec15864c4c2" exitCode=0 Dec 03 19:11:40 crc kubenswrapper[4731]: I1203 19:11:40.372547 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6f66-account-create-update-8242w" event={"ID":"2497e019-571b-42b7-bec0-f1ea5d26c3e9","Type":"ContainerDied","Data":"1f4eb078260b9ab9b528398b168e5a472464724dc91a27d1d038fec15864c4c2"} Dec 03 19:11:40 crc kubenswrapper[4731]: I1203 19:11:40.384951 4731 generic.go:334] "Generic (PLEG): container finished" podID="c75c482c-bd09-47ec-b8e9-0f114cd19ccd" containerID="3e0605093657eb676597f03fae1fad86fadcbcb2d1b7e42904647a25c6f5779e" exitCode=0 Dec 03 19:11:40 crc 
kubenswrapper[4731]: I1203 19:11:40.385328 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xbdsh" event={"ID":"c75c482c-bd09-47ec-b8e9-0f114cd19ccd","Type":"ContainerDied","Data":"3e0605093657eb676597f03fae1fad86fadcbcb2d1b7e42904647a25c6f5779e"} Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.008754 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.015012 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.163953 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-operator-scripts\") pod \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\" (UID: \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\") " Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.164335 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdt44\" (UniqueName: \"kubernetes.io/projected/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-kube-api-access-cdt44\") pod \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\" (UID: \"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f\") " Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.164997 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f" (UID: "e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.165703 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfpff\" (UniqueName: \"kubernetes.io/projected/68165469-3496-4274-9d9b-f397ed17e5c9-kube-api-access-zfpff\") pod \"68165469-3496-4274-9d9b-f397ed17e5c9\" (UID: \"68165469-3496-4274-9d9b-f397ed17e5c9\") " Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.165874 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68165469-3496-4274-9d9b-f397ed17e5c9-operator-scripts\") pod \"68165469-3496-4274-9d9b-f397ed17e5c9\" (UID: \"68165469-3496-4274-9d9b-f397ed17e5c9\") " Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.166379 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68165469-3496-4274-9d9b-f397ed17e5c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68165469-3496-4274-9d9b-f397ed17e5c9" (UID: "68165469-3496-4274-9d9b-f397ed17e5c9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.166602 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68165469-3496-4274-9d9b-f397ed17e5c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.166669 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.169693 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68165469-3496-4274-9d9b-f397ed17e5c9-kube-api-access-zfpff" (OuterVolumeSpecName: "kube-api-access-zfpff") pod "68165469-3496-4274-9d9b-f397ed17e5c9" (UID: "68165469-3496-4274-9d9b-f397ed17e5c9"). InnerVolumeSpecName "kube-api-access-zfpff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.169926 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-kube-api-access-cdt44" (OuterVolumeSpecName: "kube-api-access-cdt44") pod "e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f" (UID: "e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f"). InnerVolumeSpecName "kube-api-access-cdt44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.268369 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdt44\" (UniqueName: \"kubernetes.io/projected/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f-kube-api-access-cdt44\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.268409 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfpff\" (UniqueName: \"kubernetes.io/projected/68165469-3496-4274-9d9b-f397ed17e5c9-kube-api-access-zfpff\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.395446 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b","Type":"ContainerStarted","Data":"6c86f97930867d7e96e7d64a75e8e0731c393d4dc8cbab33295a96de1806f905"} Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.395684 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.397136 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pbxzw" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.397140 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pbxzw" event={"ID":"68165469-3496-4274-9d9b-f397ed17e5c9","Type":"ContainerDied","Data":"9ad2f7e2b759763b4a7efb80e8dd5de1deb10cbc267b2e4400ef13921ab0c3f3"} Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.397173 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad2f7e2b759763b4a7efb80e8dd5de1deb10cbc267b2e4400ef13921ab0c3f3" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.398861 4731 generic.go:334] "Generic (PLEG): container finished" podID="6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" containerID="b89ec29dd46411636e007d833373bc5812e5e464d4c6ce4431ade43776e2a602" exitCode=0 Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.398937 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6vv75" event={"ID":"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285","Type":"ContainerDied","Data":"b89ec29dd46411636e007d833373bc5812e5e464d4c6ce4431ade43776e2a602"} Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.400573 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6b0-account-create-update-vkkvz" event={"ID":"e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f","Type":"ContainerDied","Data":"7e8a32eaeb74c440561fa361c0abb2786c38d54ee15b4f415610612c048cd546"} Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.400604 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e8a32eaeb74c440561fa361c0abb2786c38d54ee15b4f415610612c048cd546" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.400755 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d6b0-account-create-update-vkkvz" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.437537 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=66.560066131 podStartE2EDuration="1m9.437516508s" podCreationTimestamp="2025-12-03 19:10:32 +0000 UTC" firstStartedPulling="2025-12-03 19:10:54.998596945 +0000 UTC m=+975.597191409" lastFinishedPulling="2025-12-03 19:10:57.876047292 +0000 UTC m=+978.474641786" observedRunningTime="2025-12-03 19:11:41.427454568 +0000 UTC m=+1022.026049032" watchObservedRunningTime="2025-12-03 19:11:41.437516508 +0000 UTC m=+1022.036110972" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.865493 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.984683 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-operator-scripts\") pod \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\" (UID: \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\") " Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.984790 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl62p\" (UniqueName: \"kubernetes.io/projected/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-kube-api-access-wl62p\") pod \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\" (UID: \"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef\") " Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.985405 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef" (UID: "949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:41 crc kubenswrapper[4731]: I1203 19:11:41.990322 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-kube-api-access-wl62p" (OuterVolumeSpecName: "kube-api-access-wl62p") pod "949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef" (UID: "949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef"). InnerVolumeSpecName "kube-api-access-wl62p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.037463 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.046384 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.058826 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.086957 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl62p\" (UniqueName: \"kubernetes.io/projected/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-kube-api-access-wl62p\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.087001 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.188394 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-operator-scripts\") pod \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\" (UID: \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.188680 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2497e019-571b-42b7-bec0-f1ea5d26c3e9-operator-scripts\") pod \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\" (UID: \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.188838 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-operator-scripts\") pod \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\" (UID: \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.188944 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"4fe6a778-abea-4237-bdcb-2a9c5a8c2967" (UID: "4fe6a778-abea-4237-bdcb-2a9c5a8c2967"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.189045 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs28n\" (UniqueName: \"kubernetes.io/projected/2497e019-571b-42b7-bec0-f1ea5d26c3e9-kube-api-access-bs28n\") pod \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\" (UID: \"2497e019-571b-42b7-bec0-f1ea5d26c3e9\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.189150 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97cpc\" (UniqueName: \"kubernetes.io/projected/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-kube-api-access-97cpc\") pod \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\" (UID: \"4fe6a778-abea-4237-bdcb-2a9c5a8c2967\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.189284 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kxgx\" (UniqueName: \"kubernetes.io/projected/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-kube-api-access-4kxgx\") pod \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\" (UID: \"c75c482c-bd09-47ec-b8e9-0f114cd19ccd\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.189305 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c75c482c-bd09-47ec-b8e9-0f114cd19ccd" (UID: "c75c482c-bd09-47ec-b8e9-0f114cd19ccd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.189296 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2497e019-571b-42b7-bec0-f1ea5d26c3e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2497e019-571b-42b7-bec0-f1ea5d26c3e9" (UID: "2497e019-571b-42b7-bec0-f1ea5d26c3e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.190014 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.192561 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2497e019-571b-42b7-bec0-f1ea5d26c3e9-kube-api-access-bs28n" (OuterVolumeSpecName: "kube-api-access-bs28n") pod "2497e019-571b-42b7-bec0-f1ea5d26c3e9" (UID: "2497e019-571b-42b7-bec0-f1ea5d26c3e9"). InnerVolumeSpecName "kube-api-access-bs28n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.193119 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-kube-api-access-97cpc" (OuterVolumeSpecName: "kube-api-access-97cpc") pod "4fe6a778-abea-4237-bdcb-2a9c5a8c2967" (UID: "4fe6a778-abea-4237-bdcb-2a9c5a8c2967"). InnerVolumeSpecName "kube-api-access-97cpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.193619 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-kube-api-access-4kxgx" (OuterVolumeSpecName: "kube-api-access-4kxgx") pod "c75c482c-bd09-47ec-b8e9-0f114cd19ccd" (UID: "c75c482c-bd09-47ec-b8e9-0f114cd19ccd"). InnerVolumeSpecName "kube-api-access-4kxgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.291662 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2497e019-571b-42b7-bec0-f1ea5d26c3e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.291712 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.291736 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs28n\" (UniqueName: \"kubernetes.io/projected/2497e019-571b-42b7-bec0-f1ea5d26c3e9-kube-api-access-bs28n\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.291749 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97cpc\" (UniqueName: \"kubernetes.io/projected/4fe6a778-abea-4237-bdcb-2a9c5a8c2967-kube-api-access-97cpc\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.291759 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kxgx\" (UniqueName: \"kubernetes.io/projected/c75c482c-bd09-47ec-b8e9-0f114cd19ccd-kube-api-access-4kxgx\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.419484 4731 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tmrzz" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.419766 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tmrzz" event={"ID":"949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef","Type":"ContainerDied","Data":"148a11ae92895d1fc5d0921bced3de67286e9ee44f34bc37f7af870c9b3aadc6"} Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.419876 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="148a11ae92895d1fc5d0921bced3de67286e9ee44f34bc37f7af870c9b3aadc6" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.423190 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6f66-account-create-update-8242w" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.423192 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6f66-account-create-update-8242w" event={"ID":"2497e019-571b-42b7-bec0-f1ea5d26c3e9","Type":"ContainerDied","Data":"25dfe6d99917d299c99ae0bd972599cbf67d6960def7aab327539d5a77145a08"} Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.423321 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25dfe6d99917d299c99ae0bd972599cbf67d6960def7aab327539d5a77145a08" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.425933 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xbdsh" event={"ID":"c75c482c-bd09-47ec-b8e9-0f114cd19ccd","Type":"ContainerDied","Data":"db4205b7ebaa0241846bcf7978227de34a00913a92c426d438a4e2b93d94c3a4"} Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.425994 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db4205b7ebaa0241846bcf7978227de34a00913a92c426d438a4e2b93d94c3a4" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.425954 4731 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/glance-db-create-xbdsh" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.428468 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2324-account-create-update-ll8nj" event={"ID":"4fe6a778-abea-4237-bdcb-2a9c5a8c2967","Type":"ContainerDied","Data":"8834349aa5b9c30905afa7d40283b49e5a61238f6fb42f469fbdb020ecabe540"} Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.428509 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8834349aa5b9c30905afa7d40283b49e5a61238f6fb42f469fbdb020ecabe540" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.428552 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2324-account-create-update-ll8nj" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.801475 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6vv75" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.901128 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-dispersionconf\") pod \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.901218 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-ring-data-devices\") pod \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.901351 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-etc-swift\") pod 
\"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.901449 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-swiftconf\") pod \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.901514 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-scripts\") pod \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.901603 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2tjg\" (UniqueName: \"kubernetes.io/projected/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-kube-api-access-c2tjg\") pod \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.901627 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-combined-ca-bundle\") pod \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\" (UID: \"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285\") " Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.901896 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" (UID: "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.902432 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" (UID: "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.905950 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-kube-api-access-c2tjg" (OuterVolumeSpecName: "kube-api-access-c2tjg") pod "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" (UID: "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285"). InnerVolumeSpecName "kube-api-access-c2tjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.908737 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" (UID: "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.922648 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-scripts" (OuterVolumeSpecName: "scripts") pod "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" (UID: "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.924785 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" (UID: "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:11:42 crc kubenswrapper[4731]: I1203 19:11:42.930798 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" (UID: "6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.003401 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.003442 4731 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.003452 4731 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.003461 4731 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 
03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.003471 4731 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-swiftconf\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.003483 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.003493 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2tjg\" (UniqueName: \"kubernetes.io/projected/6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285-kube-api-access-c2tjg\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.438900 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6vv75" event={"ID":"6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285","Type":"ContainerDied","Data":"5154f743ed0adf554ede0f003d2e681fc5bcf52ffde84c720b8ffc22db71d16a"}
Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.438953 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5154f743ed0adf554ede0f003d2e681fc5bcf52ffde84c720b8ffc22db71d16a"
Dec 03 19:11:43 crc kubenswrapper[4731]: I1203 19:11:43.439695 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6vv75"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.356068 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-s4dgb"]
Dec 03 19:11:47 crc kubenswrapper[4731]: E1203 19:11:47.358116 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75c482c-bd09-47ec-b8e9-0f114cd19ccd" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.358197 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75c482c-bd09-47ec-b8e9-0f114cd19ccd" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: E1203 19:11:47.358325 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.358402 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: E1203 19:11:47.358494 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68165469-3496-4274-9d9b-f397ed17e5c9" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.358574 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="68165469-3496-4274-9d9b-f397ed17e5c9" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: E1203 19:11:47.358659 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.358722 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: E1203 19:11:47.358778 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2497e019-571b-42b7-bec0-f1ea5d26c3e9" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.358890 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2497e019-571b-42b7-bec0-f1ea5d26c3e9" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: E1203 19:11:47.358950 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe6a778-abea-4237-bdcb-2a9c5a8c2967" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359016 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe6a778-abea-4237-bdcb-2a9c5a8c2967" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: E1203 19:11:47.359078 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" containerName="swift-ring-rebalance"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359129 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" containerName="swift-ring-rebalance"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359364 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2497e019-571b-42b7-bec0-f1ea5d26c3e9" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359437 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285" containerName="swift-ring-rebalance"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359503 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359654 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe6a778-abea-4237-bdcb-2a9c5a8c2967" containerName="mariadb-account-create-update"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359740 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359794 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="68165469-3496-4274-9d9b-f397ed17e5c9" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.359848 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75c482c-bd09-47ec-b8e9-0f114cd19ccd" containerName="mariadb-database-create"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.360478 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.364677 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.364678 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-txrr7"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.381834 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s4dgb"]
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.467470 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4rrp7"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.475530 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4rrp7"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.510525 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-combined-ca-bundle\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.510622 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf22b\" (UniqueName: \"kubernetes.io/projected/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-kube-api-access-gf22b\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.510744 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-db-sync-config-data\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.510777 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-config-data\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.612357 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-db-sync-config-data\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.612412 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-config-data\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.612508 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-combined-ca-bundle\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.612542 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf22b\" (UniqueName: \"kubernetes.io/projected/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-kube-api-access-gf22b\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.634337 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-config-data\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.639787 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf22b\" (UniqueName: \"kubernetes.io/projected/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-kube-api-access-gf22b\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.643085 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-db-sync-config-data\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.643108 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-combined-ca-bundle\") pod \"glance-db-sync-s4dgb\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.686019 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s4dgb"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.844038 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s8zdm-config-799hm"]
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.845473 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.849534 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.893833 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s8zdm-config-799hm"]
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.924377 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj8mn\" (UniqueName: \"kubernetes.io/projected/48ceea9e-79b9-443d-aae3-bc266394b602-kube-api-access-hj8mn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.924457 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-log-ovn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.924606 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run-ovn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.924782 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.924840 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-scripts\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:47 crc kubenswrapper[4731]: I1203 19:11:47.925037 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-additional-scripts\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.027316 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8mn\" (UniqueName: \"kubernetes.io/projected/48ceea9e-79b9-443d-aae3-bc266394b602-kube-api-access-hj8mn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.027628 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-log-ovn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.027659 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run-ovn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.027735 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.027762 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-scripts\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.027791 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-additional-scripts\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.028605 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-additional-scripts\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.028794 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.028831 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-log-ovn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.028840 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run-ovn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.030570 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-scripts\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.056447 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.058642 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8mn\" (UniqueName: \"kubernetes.io/projected/48ceea9e-79b9-443d-aae3-bc266394b602-kube-api-access-hj8mn\") pod \"ovn-controller-s8zdm-config-799hm\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") " pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.210238 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.363537 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s4dgb"]
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.484983 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s4dgb" event={"ID":"7357e9a7-ce03-47ff-a1a5-55b8d1280d31","Type":"ContainerStarted","Data":"b9bcc0fe85bffef16879bb3bfadd22d05b032722e4af06f13cc0b9610f318c0d"}
Dec 03 19:11:48 crc kubenswrapper[4731]: I1203 19:11:48.716601 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s8zdm-config-799hm"]
Dec 03 19:11:49 crc kubenswrapper[4731]: I1203 19:11:49.500736 4731 generic.go:334] "Generic (PLEG): container finished" podID="48ceea9e-79b9-443d-aae3-bc266394b602" containerID="01710f631071c8907e484cd17ba7623a49c9b45ef08b12efd0a650d841207ff8" exitCode=0
Dec 03 19:11:49 crc kubenswrapper[4731]: I1203 19:11:49.500846 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s8zdm-config-799hm" event={"ID":"48ceea9e-79b9-443d-aae3-bc266394b602","Type":"ContainerDied","Data":"01710f631071c8907e484cd17ba7623a49c9b45ef08b12efd0a650d841207ff8"}
Dec 03 19:11:49 crc kubenswrapper[4731]: I1203 19:11:49.501050 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s8zdm-config-799hm" event={"ID":"48ceea9e-79b9-443d-aae3-bc266394b602","Type":"ContainerStarted","Data":"ae1d77272793b60cd861f932e7d313dbc6d3cd6608dad28867d4a436b7473997"}
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.021953 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090080 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-log-ovn\") pod \"48ceea9e-79b9-443d-aae3-bc266394b602\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") "
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090159 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-scripts\") pod \"48ceea9e-79b9-443d-aae3-bc266394b602\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") "
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090268 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "48ceea9e-79b9-443d-aae3-bc266394b602" (UID: "48ceea9e-79b9-443d-aae3-bc266394b602"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090295 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-additional-scripts\") pod \"48ceea9e-79b9-443d-aae3-bc266394b602\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") "
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090395 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run\") pod \"48ceea9e-79b9-443d-aae3-bc266394b602\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") "
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090503 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run-ovn\") pod \"48ceea9e-79b9-443d-aae3-bc266394b602\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") "
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090562 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "48ceea9e-79b9-443d-aae3-bc266394b602" (UID: "48ceea9e-79b9-443d-aae3-bc266394b602"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090596 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj8mn\" (UniqueName: \"kubernetes.io/projected/48ceea9e-79b9-443d-aae3-bc266394b602-kube-api-access-hj8mn\") pod \"48ceea9e-79b9-443d-aae3-bc266394b602\" (UID: \"48ceea9e-79b9-443d-aae3-bc266394b602\") "
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.090573 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run" (OuterVolumeSpecName: "var-run") pod "48ceea9e-79b9-443d-aae3-bc266394b602" (UID: "48ceea9e-79b9-443d-aae3-bc266394b602"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.091355 4731 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.091377 4731 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.091390 4731 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/48ceea9e-79b9-443d-aae3-bc266394b602-var-log-ovn\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.091412 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "48ceea9e-79b9-443d-aae3-bc266394b602" (UID: "48ceea9e-79b9-443d-aae3-bc266394b602"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.092215 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-scripts" (OuterVolumeSpecName: "scripts") pod "48ceea9e-79b9-443d-aae3-bc266394b602" (UID: "48ceea9e-79b9-443d-aae3-bc266394b602"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.107926 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ceea9e-79b9-443d-aae3-bc266394b602-kube-api-access-hj8mn" (OuterVolumeSpecName: "kube-api-access-hj8mn") pod "48ceea9e-79b9-443d-aae3-bc266394b602" (UID: "48ceea9e-79b9-443d-aae3-bc266394b602"). InnerVolumeSpecName "kube-api-access-hj8mn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.193525 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj8mn\" (UniqueName: \"kubernetes.io/projected/48ceea9e-79b9-443d-aae3-bc266394b602-kube-api-access-hj8mn\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.193571 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.193582 4731 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/48ceea9e-79b9-443d-aae3-bc266394b602-additional-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.516466 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s8zdm-config-799hm" event={"ID":"48ceea9e-79b9-443d-aae3-bc266394b602","Type":"ContainerDied","Data":"ae1d77272793b60cd861f932e7d313dbc6d3cd6608dad28867d4a436b7473997"}
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.516760 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae1d77272793b60cd861f932e7d313dbc6d3cd6608dad28867d4a436b7473997"
Dec 03 19:11:51 crc kubenswrapper[4731]: I1203 19:11:51.516578 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s8zdm-config-799hm"
Dec 03 19:11:52 crc kubenswrapper[4731]: I1203 19:11:52.131374 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s8zdm-config-799hm"]
Dec 03 19:11:52 crc kubenswrapper[4731]: I1203 19:11:52.137967 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s8zdm-config-799hm"]
Dec 03 19:11:52 crc kubenswrapper[4731]: I1203 19:11:52.313292 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0"
Dec 03 19:11:52 crc kubenswrapper[4731]: I1203 19:11:52.318468 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8327126f-a2f3-4b2d-a5b3-118bfa1f41ce-etc-swift\") pod \"swift-storage-0\" (UID: \"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce\") " pod="openstack/swift-storage-0"
Dec 03 19:11:52 crc kubenswrapper[4731]: I1203 19:11:52.587693 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Dec 03 19:11:53 crc kubenswrapper[4731]: I1203 19:11:53.353984 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 03 19:11:53 crc kubenswrapper[4731]: W1203 19:11:53.372712 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8327126f_a2f3_4b2d_a5b3_118bfa1f41ce.slice/crio-90ba384e12fe82cad1a793125ad93c423f8d11ebb1bed007c23643baea51a1b7 WatchSource:0}: Error finding container 90ba384e12fe82cad1a793125ad93c423f8d11ebb1bed007c23643baea51a1b7: Status 404 returned error can't find the container with id 90ba384e12fe82cad1a793125ad93c423f8d11ebb1bed007c23643baea51a1b7
Dec 03 19:11:53 crc kubenswrapper[4731]: I1203 19:11:53.532357 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"90ba384e12fe82cad1a793125ad93c423f8d11ebb1bed007c23643baea51a1b7"}
Dec 03 19:11:53 crc kubenswrapper[4731]: I1203 19:11:53.869568 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ceea9e-79b9-443d-aae3-bc266394b602" path="/var/lib/kubelet/pods/48ceea9e-79b9-443d-aae3-bc266394b602/volumes"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.143465 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.460483 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.477548 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-k9zst"]
Dec 03 19:11:54 crc kubenswrapper[4731]: E1203 19:11:54.477962 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ceea9e-79b9-443d-aae3-bc266394b602" containerName="ovn-config"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.477974 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ceea9e-79b9-443d-aae3-bc266394b602" containerName="ovn-config"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.478142 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ceea9e-79b9-443d-aae3-bc266394b602" containerName="ovn-config"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.478816 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9zst"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.506981 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k9zst"]
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.661522 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4dg\" (UniqueName: \"kubernetes.io/projected/29ced43a-3e30-4722-90ac-7b1184354703-kube-api-access-xf4dg\") pod \"cinder-db-create-k9zst\" (UID: \"29ced43a-3e30-4722-90ac-7b1184354703\") " pod="openstack/cinder-db-create-k9zst"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.663523 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ced43a-3e30-4722-90ac-7b1184354703-operator-scripts\") pod \"cinder-db-create-k9zst\" (UID: \"29ced43a-3e30-4722-90ac-7b1184354703\") " pod="openstack/cinder-db-create-k9zst"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.680393 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-jpsqh"]
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.714889 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jpsqh"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.720190 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jpsqh"]
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.765672 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3930526-a0aa-4712-bb01-762d53e5cdba-operator-scripts\") pod \"barbican-db-create-jpsqh\" (UID: \"c3930526-a0aa-4712-bb01-762d53e5cdba\") " pod="openstack/barbican-db-create-jpsqh"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.765757 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ced43a-3e30-4722-90ac-7b1184354703-operator-scripts\") pod \"cinder-db-create-k9zst\" (UID: \"29ced43a-3e30-4722-90ac-7b1184354703\") " pod="openstack/cinder-db-create-k9zst"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.765812 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wt5c\" (UniqueName: \"kubernetes.io/projected/c3930526-a0aa-4712-bb01-762d53e5cdba-kube-api-access-2wt5c\") pod \"barbican-db-create-jpsqh\" (UID: \"c3930526-a0aa-4712-bb01-762d53e5cdba\") " pod="openstack/barbican-db-create-jpsqh"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.765856 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4dg\" (UniqueName: \"kubernetes.io/projected/29ced43a-3e30-4722-90ac-7b1184354703-kube-api-access-xf4dg\") pod \"cinder-db-create-k9zst\" (UID: \"29ced43a-3e30-4722-90ac-7b1184354703\") " pod="openstack/cinder-db-create-k9zst"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.767312 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ced43a-3e30-4722-90ac-7b1184354703-operator-scripts\") pod \"cinder-db-create-k9zst\" (UID: \"29ced43a-3e30-4722-90ac-7b1184354703\") " pod="openstack/cinder-db-create-k9zst"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.785492 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4dg\" (UniqueName: \"kubernetes.io/projected/29ced43a-3e30-4722-90ac-7b1184354703-kube-api-access-xf4dg\") pod \"cinder-db-create-k9zst\" (UID: \"29ced43a-3e30-4722-90ac-7b1184354703\") " pod="openstack/cinder-db-create-k9zst"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.818010 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9zst"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.855273 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-c7rr9"]
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.856227 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c7rr9"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.859185 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.859385 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.859560 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rlxdn"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.859729 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.867588 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3930526-a0aa-4712-bb01-762d53e5cdba-operator-scripts\") pod \"barbican-db-create-jpsqh\" (UID: \"c3930526-a0aa-4712-bb01-762d53e5cdba\") " pod="openstack/barbican-db-create-jpsqh"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.867649 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wt5c\" (UniqueName: \"kubernetes.io/projected/c3930526-a0aa-4712-bb01-762d53e5cdba-kube-api-access-2wt5c\") pod \"barbican-db-create-jpsqh\" (UID: \"c3930526-a0aa-4712-bb01-762d53e5cdba\") " pod="openstack/barbican-db-create-jpsqh"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.868728 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3930526-a0aa-4712-bb01-762d53e5cdba-operator-scripts\") pod \"barbican-db-create-jpsqh\" (UID: \"c3930526-a0aa-4712-bb01-762d53e5cdba\") " pod="openstack/barbican-db-create-jpsqh"
Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.969453 4731 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-combined-ca-bundle\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.969543 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwh4w\" (UniqueName: \"kubernetes.io/projected/9705a938-570d-4016-be7e-60a1ed1ed1cc-kube-api-access-wwh4w\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:54 crc kubenswrapper[4731]: I1203 19:11:54.969740 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-config-data\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.028390 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c7rr9"] Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.041730 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wt5c\" (UniqueName: \"kubernetes.io/projected/c3930526-a0aa-4712-bb01-762d53e5cdba-kube-api-access-2wt5c\") pod \"barbican-db-create-jpsqh\" (UID: \"c3930526-a0aa-4712-bb01-762d53e5cdba\") " pod="openstack/barbican-db-create-jpsqh" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.061723 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ab2e-account-create-update-h8wtr"] Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.062971 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.134032 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-config-data\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.134155 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-combined-ca-bundle\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.134211 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwh4w\" (UniqueName: \"kubernetes.io/projected/9705a938-570d-4016-be7e-60a1ed1ed1cc-kube-api-access-wwh4w\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.135940 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.147987 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-config-data\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.148550 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-combined-ca-bundle\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.196786 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ab2e-account-create-update-h8wtr"] Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.230135 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwh4w\" (UniqueName: \"kubernetes.io/projected/9705a938-570d-4016-be7e-60a1ed1ed1cc-kube-api-access-wwh4w\") pod \"keystone-db-sync-c7rr9\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.235474 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c398f19-6cd0-4998-ba26-d993cbee31e4-operator-scripts\") pod \"barbican-ab2e-account-create-update-h8wtr\" (UID: \"1c398f19-6cd0-4998-ba26-d993cbee31e4\") " pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.235548 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p59mt\" (UniqueName: \"kubernetes.io/projected/1c398f19-6cd0-4998-ba26-d993cbee31e4-kube-api-access-p59mt\") pod \"barbican-ab2e-account-create-update-h8wtr\" (UID: \"1c398f19-6cd0-4998-ba26-d993cbee31e4\") " pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.242916 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.309548 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mjcmb"] Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.310610 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mjcmb" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.335244 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jpsqh" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.337891 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c398f19-6cd0-4998-ba26-d993cbee31e4-operator-scripts\") pod \"barbican-ab2e-account-create-update-h8wtr\" (UID: \"1c398f19-6cd0-4998-ba26-d993cbee31e4\") " pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.337953 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p59mt\" (UniqueName: \"kubernetes.io/projected/1c398f19-6cd0-4998-ba26-d993cbee31e4-kube-api-access-p59mt\") pod \"barbican-ab2e-account-create-update-h8wtr\" (UID: \"1c398f19-6cd0-4998-ba26-d993cbee31e4\") " pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.339051 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c398f19-6cd0-4998-ba26-d993cbee31e4-operator-scripts\") pod \"barbican-ab2e-account-create-update-h8wtr\" (UID: \"1c398f19-6cd0-4998-ba26-d993cbee31e4\") " pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.385088 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mjcmb"] 
Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.402220 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p59mt\" (UniqueName: \"kubernetes.io/projected/1c398f19-6cd0-4998-ba26-d993cbee31e4-kube-api-access-p59mt\") pod \"barbican-ab2e-account-create-update-h8wtr\" (UID: \"1c398f19-6cd0-4998-ba26-d993cbee31e4\") " pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.439554 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a23e91-45e1-4e8f-84b5-0787b2198d00-operator-scripts\") pod \"neutron-db-create-mjcmb\" (UID: \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\") " pod="openstack/neutron-db-create-mjcmb" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.439678 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x8c9\" (UniqueName: \"kubernetes.io/projected/d3a23e91-45e1-4e8f-84b5-0787b2198d00-kube-api-access-7x8c9\") pod \"neutron-db-create-mjcmb\" (UID: \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\") " pod="openstack/neutron-db-create-mjcmb" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.485593 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-69a9-account-create-update-b4j6g"] Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.487879 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.493112 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.538039 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6755-account-create-update-cc66g"] Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.539203 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.551884 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6755-account-create-update-cc66g"] Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.552530 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.565843 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a23e91-45e1-4e8f-84b5-0787b2198d00-operator-scripts\") pod \"neutron-db-create-mjcmb\" (UID: \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\") " pod="openstack/neutron-db-create-mjcmb" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.565920 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwktg\" (UniqueName: \"kubernetes.io/projected/4edeffd1-c6df-4546-8e0a-419f4ee657e8-kube-api-access-kwktg\") pod \"neutron-6755-account-create-update-cc66g\" (UID: \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\") " pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.565958 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4edeffd1-c6df-4546-8e0a-419f4ee657e8-operator-scripts\") pod \"neutron-6755-account-create-update-cc66g\" (UID: \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\") " pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.565983 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x8c9\" (UniqueName: \"kubernetes.io/projected/d3a23e91-45e1-4e8f-84b5-0787b2198d00-kube-api-access-7x8c9\") pod \"neutron-db-create-mjcmb\" (UID: \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\") " pod="openstack/neutron-db-create-mjcmb" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.566014 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-operator-scripts\") pod \"cinder-69a9-account-create-update-b4j6g\" (UID: \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\") " pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.566046 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv7ms\" (UniqueName: \"kubernetes.io/projected/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-kube-api-access-fv7ms\") pod \"cinder-69a9-account-create-update-b4j6g\" (UID: \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\") " pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.566187 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.567485 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a23e91-45e1-4e8f-84b5-0787b2198d00-operator-scripts\") pod \"neutron-db-create-mjcmb\" (UID: \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\") " pod="openstack/neutron-db-create-mjcmb" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.586807 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x8c9\" (UniqueName: \"kubernetes.io/projected/d3a23e91-45e1-4e8f-84b5-0787b2198d00-kube-api-access-7x8c9\") pod \"neutron-db-create-mjcmb\" (UID: \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\") " pod="openstack/neutron-db-create-mjcmb" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.622561 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-69a9-account-create-update-b4j6g"] Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.627836 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mjcmb" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.667636 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-operator-scripts\") pod \"cinder-69a9-account-create-update-b4j6g\" (UID: \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\") " pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.667736 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv7ms\" (UniqueName: \"kubernetes.io/projected/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-kube-api-access-fv7ms\") pod \"cinder-69a9-account-create-update-b4j6g\" (UID: \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\") " pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.667850 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwktg\" (UniqueName: \"kubernetes.io/projected/4edeffd1-c6df-4546-8e0a-419f4ee657e8-kube-api-access-kwktg\") pod \"neutron-6755-account-create-update-cc66g\" (UID: \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\") " pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.667898 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4edeffd1-c6df-4546-8e0a-419f4ee657e8-operator-scripts\") pod \"neutron-6755-account-create-update-cc66g\" (UID: \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\") " pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.668884 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4edeffd1-c6df-4546-8e0a-419f4ee657e8-operator-scripts\") pod \"neutron-6755-account-create-update-cc66g\" (UID: \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\") " pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.669245 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-operator-scripts\") pod \"cinder-69a9-account-create-update-b4j6g\" (UID: \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\") " pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.700279 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwktg\" (UniqueName: \"kubernetes.io/projected/4edeffd1-c6df-4546-8e0a-419f4ee657e8-kube-api-access-kwktg\") pod \"neutron-6755-account-create-update-cc66g\" (UID: \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\") " pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.701130 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv7ms\" (UniqueName: \"kubernetes.io/projected/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-kube-api-access-fv7ms\") pod \"cinder-69a9-account-create-update-b4j6g\" (UID: \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\") " pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.862524 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:11:55 crc kubenswrapper[4731]: I1203 19:11:55.864554 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:11:57 crc kubenswrapper[4731]: I1203 19:11:57.414907 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s8zdm" Dec 03 19:12:07 crc kubenswrapper[4731]: E1203 19:12:07.234225 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 03 19:12:07 crc kubenswrapper[4731]: E1203 19:12:07.235119 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gf22b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPro
be:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-s4dgb_openstack(7357e9a7-ce03-47ff-a1a5-55b8d1280d31): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:12:07 crc kubenswrapper[4731]: E1203 19:12:07.236335 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-s4dgb" podUID="7357e9a7-ce03-47ff-a1a5-55b8d1280d31" Dec 03 19:12:07 crc kubenswrapper[4731]: E1203 19:12:07.920328 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-s4dgb" podUID="7357e9a7-ce03-47ff-a1a5-55b8d1280d31" Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.003594 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c7rr9"] Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.203054 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-69a9-account-create-update-b4j6g"] Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.216752 4731 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6755-account-create-update-cc66g"] Dec 03 19:12:08 crc kubenswrapper[4731]: W1203 19:12:08.217565 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4edeffd1_c6df_4546_8e0a_419f4ee657e8.slice/crio-37929d22122b54f5cfc7023a223ebdcbfb0175fdeff44638f90eb573fc712195 WatchSource:0}: Error finding container 37929d22122b54f5cfc7023a223ebdcbfb0175fdeff44638f90eb573fc712195: Status 404 returned error can't find the container with id 37929d22122b54f5cfc7023a223ebdcbfb0175fdeff44638f90eb573fc712195 Dec 03 19:12:08 crc kubenswrapper[4731]: W1203 19:12:08.219889 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcf45b3a_a5e8_4958_86e7_734c6e3f9092.slice/crio-c7008d760586f765f7b2a88c1026f5885490cce43e9020034db20dc42dcc638c WatchSource:0}: Error finding container c7008d760586f765f7b2a88c1026f5885490cce43e9020034db20dc42dcc638c: Status 404 returned error can't find the container with id c7008d760586f765f7b2a88c1026f5885490cce43e9020034db20dc42dcc638c Dec 03 19:12:08 crc kubenswrapper[4731]: W1203 19:12:08.221824 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3930526_a0aa_4712_bb01_762d53e5cdba.slice/crio-54a72a32d126282263f75acdc8631708d4c3fc7d21326d98d9fe2ccdc44046ff WatchSource:0}: Error finding container 54a72a32d126282263f75acdc8631708d4c3fc7d21326d98d9fe2ccdc44046ff: Status 404 returned error can't find the container with id 54a72a32d126282263f75acdc8631708d4c3fc7d21326d98d9fe2ccdc44046ff Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.231059 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mjcmb"] Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.238475 4731 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-db-create-jpsqh"] Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.378633 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ab2e-account-create-update-h8wtr"] Dec 03 19:12:08 crc kubenswrapper[4731]: W1203 19:12:08.385128 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29ced43a_3e30_4722_90ac_7b1184354703.slice/crio-4d6c5cd0b49e853fa0f09bf4f5d4ad0695652a1fa90d325a351db4ab24398c4c WatchSource:0}: Error finding container 4d6c5cd0b49e853fa0f09bf4f5d4ad0695652a1fa90d325a351db4ab24398c4c: Status 404 returned error can't find the container with id 4d6c5cd0b49e853fa0f09bf4f5d4ad0695652a1fa90d325a351db4ab24398c4c Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.387437 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k9zst"] Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.918526 4731 generic.go:334] "Generic (PLEG): container finished" podID="d3a23e91-45e1-4e8f-84b5-0787b2198d00" containerID="2acaab0662167e5d49ff9a22f2a9d1ac7ee085001fa9b2703951bce55946bad3" exitCode=0 Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.918732 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mjcmb" event={"ID":"d3a23e91-45e1-4e8f-84b5-0787b2198d00","Type":"ContainerDied","Data":"2acaab0662167e5d49ff9a22f2a9d1ac7ee085001fa9b2703951bce55946bad3"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.918960 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mjcmb" event={"ID":"d3a23e91-45e1-4e8f-84b5-0787b2198d00","Type":"ContainerStarted","Data":"4fce46d4086ec9c5d33477e36677a52387f7e9ef80c80f04b2e6e4268f14b697"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.921427 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ab2e-account-create-update-h8wtr" 
event={"ID":"1c398f19-6cd0-4998-ba26-d993cbee31e4","Type":"ContainerStarted","Data":"2269d158aff36f00d0c2d4666ee3154868b2e8cdb219cbe8690e0b96dbd35671"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.921466 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ab2e-account-create-update-h8wtr" event={"ID":"1c398f19-6cd0-4998-ba26-d993cbee31e4","Type":"ContainerStarted","Data":"49280443d52ea27178f08a32f0d3c3cbc9bbf3b93d97ebc9af6c6f02f2030488"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.923702 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9zst" event={"ID":"29ced43a-3e30-4722-90ac-7b1184354703","Type":"ContainerStarted","Data":"93704543c38acbaaee5dcfe5ccc6aa5273a735ab6dd5fe3aed3e8434f7e2b842"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.923757 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9zst" event={"ID":"29ced43a-3e30-4722-90ac-7b1184354703","Type":"ContainerStarted","Data":"4d6c5cd0b49e853fa0f09bf4f5d4ad0695652a1fa90d325a351db4ab24398c4c"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.929854 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6755-account-create-update-cc66g" event={"ID":"4edeffd1-c6df-4546-8e0a-419f4ee657e8","Type":"ContainerStarted","Data":"4be66175609a3af1ee882ab7e54ac9a01e4e928de5f74fa191131ebabe8fb1a5"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.929893 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6755-account-create-update-cc66g" event={"ID":"4edeffd1-c6df-4546-8e0a-419f4ee657e8","Type":"ContainerStarted","Data":"37929d22122b54f5cfc7023a223ebdcbfb0175fdeff44638f90eb573fc712195"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.939691 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jpsqh" 
event={"ID":"c3930526-a0aa-4712-bb01-762d53e5cdba","Type":"ContainerStarted","Data":"ec01825e4d86bf21c3a713dacdee94ab0dca8e9523858441854213da22052130"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.939747 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jpsqh" event={"ID":"c3930526-a0aa-4712-bb01-762d53e5cdba","Type":"ContainerStarted","Data":"54a72a32d126282263f75acdc8631708d4c3fc7d21326d98d9fe2ccdc44046ff"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.946637 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c7rr9" event={"ID":"9705a938-570d-4016-be7e-60a1ed1ed1cc","Type":"ContainerStarted","Data":"b4a7d2a822edf1ab628dab0d4898cf8f8c3efd89692e3883117332f8dc0dcc93"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.955498 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-69a9-account-create-update-b4j6g" event={"ID":"dcf45b3a-a5e8-4958-86e7-734c6e3f9092","Type":"ContainerStarted","Data":"198f03e217ed599555ada706b912c4a919dfebb82dd1c70cd5685b5924e9683d"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.955560 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-69a9-account-create-update-b4j6g" event={"ID":"dcf45b3a-a5e8-4958-86e7-734c6e3f9092","Type":"ContainerStarted","Data":"c7008d760586f765f7b2a88c1026f5885490cce43e9020034db20dc42dcc638c"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.961107 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"c4351ed44323231e606bd96dca7deeef95e18eeab7fcb4713e6fce45d616ec00"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.961190 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"6f8643e535e216a476acee492585052fb51147a24e2e9ee883c157ce008cdce6"} Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.975024 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-k9zst" podStartSLOduration=14.974964226 podStartE2EDuration="14.974964226s" podCreationTimestamp="2025-12-03 19:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:08.950493261 +0000 UTC m=+1049.549087755" watchObservedRunningTime="2025-12-03 19:12:08.974964226 +0000 UTC m=+1049.573558690" Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.979464 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-ab2e-account-create-update-h8wtr" podStartSLOduration=13.979450564 podStartE2EDuration="13.979450564s" podCreationTimestamp="2025-12-03 19:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:08.968355422 +0000 UTC m=+1049.566949896" watchObservedRunningTime="2025-12-03 19:12:08.979450564 +0000 UTC m=+1049.578045028" Dec 03 19:12:08 crc kubenswrapper[4731]: I1203 19:12:08.994015 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6755-account-create-update-cc66g" podStartSLOduration=13.993989693 podStartE2EDuration="13.993989693s" podCreationTimestamp="2025-12-03 19:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:08.987957027 +0000 UTC m=+1049.586551491" watchObservedRunningTime="2025-12-03 19:12:08.993989693 +0000 UTC m=+1049.592584157" Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.010678 4731 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-69a9-account-create-update-b4j6g" podStartSLOduration=14.010657228 podStartE2EDuration="14.010657228s" podCreationTimestamp="2025-12-03 19:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:09.002054942 +0000 UTC m=+1049.600649406" watchObservedRunningTime="2025-12-03 19:12:09.010657228 +0000 UTC m=+1049.609251692" Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.971976 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"ffe32dad5adb48eedc84f8c21500a8f62146f02111f4f9cd0654ba27f2ab559e"} Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.972269 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"c66664d51ea49c20579e5cdf11151dd2d672f70d6b7e5ecdbf86bd7f77c7b121"} Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.974676 4731 generic.go:334] "Generic (PLEG): container finished" podID="1c398f19-6cd0-4998-ba26-d993cbee31e4" containerID="2269d158aff36f00d0c2d4666ee3154868b2e8cdb219cbe8690e0b96dbd35671" exitCode=0 Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.974731 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ab2e-account-create-update-h8wtr" event={"ID":"1c398f19-6cd0-4998-ba26-d993cbee31e4","Type":"ContainerDied","Data":"2269d158aff36f00d0c2d4666ee3154868b2e8cdb219cbe8690e0b96dbd35671"} Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.976472 4731 generic.go:334] "Generic (PLEG): container finished" podID="29ced43a-3e30-4722-90ac-7b1184354703" containerID="93704543c38acbaaee5dcfe5ccc6aa5273a735ab6dd5fe3aed3e8434f7e2b842" exitCode=0 Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.976534 4731 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9zst" event={"ID":"29ced43a-3e30-4722-90ac-7b1184354703","Type":"ContainerDied","Data":"93704543c38acbaaee5dcfe5ccc6aa5273a735ab6dd5fe3aed3e8434f7e2b842"} Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.978325 4731 generic.go:334] "Generic (PLEG): container finished" podID="4edeffd1-c6df-4546-8e0a-419f4ee657e8" containerID="4be66175609a3af1ee882ab7e54ac9a01e4e928de5f74fa191131ebabe8fb1a5" exitCode=0 Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.978418 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6755-account-create-update-cc66g" event={"ID":"4edeffd1-c6df-4546-8e0a-419f4ee657e8","Type":"ContainerDied","Data":"4be66175609a3af1ee882ab7e54ac9a01e4e928de5f74fa191131ebabe8fb1a5"} Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.980951 4731 generic.go:334] "Generic (PLEG): container finished" podID="c3930526-a0aa-4712-bb01-762d53e5cdba" containerID="ec01825e4d86bf21c3a713dacdee94ab0dca8e9523858441854213da22052130" exitCode=0 Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.981015 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jpsqh" event={"ID":"c3930526-a0aa-4712-bb01-762d53e5cdba","Type":"ContainerDied","Data":"ec01825e4d86bf21c3a713dacdee94ab0dca8e9523858441854213da22052130"} Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.983774 4731 generic.go:334] "Generic (PLEG): container finished" podID="dcf45b3a-a5e8-4958-86e7-734c6e3f9092" containerID="198f03e217ed599555ada706b912c4a919dfebb82dd1c70cd5685b5924e9683d" exitCode=0 Dec 03 19:12:09 crc kubenswrapper[4731]: I1203 19:12:09.983945 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-69a9-account-create-update-b4j6g" event={"ID":"dcf45b3a-a5e8-4958-86e7-734c6e3f9092","Type":"ContainerDied","Data":"198f03e217ed599555ada706b912c4a919dfebb82dd1c70cd5685b5924e9683d"} Dec 03 19:12:14 crc 
kubenswrapper[4731]: I1203 19:12:14.220081 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-69a9-account-create-update-b4j6g" event={"ID":"dcf45b3a-a5e8-4958-86e7-734c6e3f9092","Type":"ContainerDied","Data":"c7008d760586f765f7b2a88c1026f5885490cce43e9020034db20dc42dcc638c"} Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.220950 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7008d760586f765f7b2a88c1026f5885490cce43e9020034db20dc42dcc638c" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.221386 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ab2e-account-create-update-h8wtr" event={"ID":"1c398f19-6cd0-4998-ba26-d993cbee31e4","Type":"ContainerDied","Data":"49280443d52ea27178f08a32f0d3c3cbc9bbf3b93d97ebc9af6c6f02f2030488"} Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.221403 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49280443d52ea27178f08a32f0d3c3cbc9bbf3b93d97ebc9af6c6f02f2030488" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.250341 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.258231 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.287980 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c398f19-6cd0-4998-ba26-d993cbee31e4-operator-scripts\") pod \"1c398f19-6cd0-4998-ba26-d993cbee31e4\" (UID: \"1c398f19-6cd0-4998-ba26-d993cbee31e4\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.288050 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-operator-scripts\") pod \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\" (UID: \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.288098 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv7ms\" (UniqueName: \"kubernetes.io/projected/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-kube-api-access-fv7ms\") pod \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\" (UID: \"dcf45b3a-a5e8-4958-86e7-734c6e3f9092\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.288513 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p59mt\" (UniqueName: \"kubernetes.io/projected/1c398f19-6cd0-4998-ba26-d993cbee31e4-kube-api-access-p59mt\") pod \"1c398f19-6cd0-4998-ba26-d993cbee31e4\" (UID: \"1c398f19-6cd0-4998-ba26-d993cbee31e4\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.288584 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcf45b3a-a5e8-4958-86e7-734c6e3f9092" (UID: "dcf45b3a-a5e8-4958-86e7-734c6e3f9092"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.288716 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c398f19-6cd0-4998-ba26-d993cbee31e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c398f19-6cd0-4998-ba26-d993cbee31e4" (UID: "1c398f19-6cd0-4998-ba26-d993cbee31e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.289163 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c398f19-6cd0-4998-ba26-d993cbee31e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.289188 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.316151 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-kube-api-access-fv7ms" (OuterVolumeSpecName: "kube-api-access-fv7ms") pod "dcf45b3a-a5e8-4958-86e7-734c6e3f9092" (UID: "dcf45b3a-a5e8-4958-86e7-734c6e3f9092"). InnerVolumeSpecName "kube-api-access-fv7ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.316461 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c398f19-6cd0-4998-ba26-d993cbee31e4-kube-api-access-p59mt" (OuterVolumeSpecName: "kube-api-access-p59mt") pod "1c398f19-6cd0-4998-ba26-d993cbee31e4" (UID: "1c398f19-6cd0-4998-ba26-d993cbee31e4"). InnerVolumeSpecName "kube-api-access-p59mt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.390638 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p59mt\" (UniqueName: \"kubernetes.io/projected/1c398f19-6cd0-4998-ba26-d993cbee31e4-kube-api-access-p59mt\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.390675 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv7ms\" (UniqueName: \"kubernetes.io/projected/dcf45b3a-a5e8-4958-86e7-734c6e3f9092-kube-api-access-fv7ms\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.566095 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9zst" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.571159 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.583737 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jpsqh" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.592431 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mjcmb" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.693210 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wt5c\" (UniqueName: \"kubernetes.io/projected/c3930526-a0aa-4712-bb01-762d53e5cdba-kube-api-access-2wt5c\") pod \"c3930526-a0aa-4712-bb01-762d53e5cdba\" (UID: \"c3930526-a0aa-4712-bb01-762d53e5cdba\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.693362 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf4dg\" (UniqueName: \"kubernetes.io/projected/29ced43a-3e30-4722-90ac-7b1184354703-kube-api-access-xf4dg\") pod \"29ced43a-3e30-4722-90ac-7b1184354703\" (UID: \"29ced43a-3e30-4722-90ac-7b1184354703\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.693543 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwktg\" (UniqueName: \"kubernetes.io/projected/4edeffd1-c6df-4546-8e0a-419f4ee657e8-kube-api-access-kwktg\") pod \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\" (UID: \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.693601 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3930526-a0aa-4712-bb01-762d53e5cdba-operator-scripts\") pod \"c3930526-a0aa-4712-bb01-762d53e5cdba\" (UID: \"c3930526-a0aa-4712-bb01-762d53e5cdba\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.693626 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ced43a-3e30-4722-90ac-7b1184354703-operator-scripts\") pod \"29ced43a-3e30-4722-90ac-7b1184354703\" (UID: \"29ced43a-3e30-4722-90ac-7b1184354703\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.693674 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4edeffd1-c6df-4546-8e0a-419f4ee657e8-operator-scripts\") pod \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\" (UID: \"4edeffd1-c6df-4546-8e0a-419f4ee657e8\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.694480 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edeffd1-c6df-4546-8e0a-419f4ee657e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4edeffd1-c6df-4546-8e0a-419f4ee657e8" (UID: "4edeffd1-c6df-4546-8e0a-419f4ee657e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.694483 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29ced43a-3e30-4722-90ac-7b1184354703-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29ced43a-3e30-4722-90ac-7b1184354703" (UID: "29ced43a-3e30-4722-90ac-7b1184354703"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.694758 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3930526-a0aa-4712-bb01-762d53e5cdba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3930526-a0aa-4712-bb01-762d53e5cdba" (UID: "c3930526-a0aa-4712-bb01-762d53e5cdba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.699648 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edeffd1-c6df-4546-8e0a-419f4ee657e8-kube-api-access-kwktg" (OuterVolumeSpecName: "kube-api-access-kwktg") pod "4edeffd1-c6df-4546-8e0a-419f4ee657e8" (UID: "4edeffd1-c6df-4546-8e0a-419f4ee657e8"). 
InnerVolumeSpecName "kube-api-access-kwktg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.700202 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3930526-a0aa-4712-bb01-762d53e5cdba-kube-api-access-2wt5c" (OuterVolumeSpecName: "kube-api-access-2wt5c") pod "c3930526-a0aa-4712-bb01-762d53e5cdba" (UID: "c3930526-a0aa-4712-bb01-762d53e5cdba"). InnerVolumeSpecName "kube-api-access-2wt5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.700625 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ced43a-3e30-4722-90ac-7b1184354703-kube-api-access-xf4dg" (OuterVolumeSpecName: "kube-api-access-xf4dg") pod "29ced43a-3e30-4722-90ac-7b1184354703" (UID: "29ced43a-3e30-4722-90ac-7b1184354703"). InnerVolumeSpecName "kube-api-access-xf4dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.795700 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a23e91-45e1-4e8f-84b5-0787b2198d00-operator-scripts\") pod \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\" (UID: \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.795774 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x8c9\" (UniqueName: \"kubernetes.io/projected/d3a23e91-45e1-4e8f-84b5-0787b2198d00-kube-api-access-7x8c9\") pod \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\" (UID: \"d3a23e91-45e1-4e8f-84b5-0787b2198d00\") " Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.796579 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf4dg\" (UniqueName: 
\"kubernetes.io/projected/29ced43a-3e30-4722-90ac-7b1184354703-kube-api-access-xf4dg\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.796631 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a23e91-45e1-4e8f-84b5-0787b2198d00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3a23e91-45e1-4e8f-84b5-0787b2198d00" (UID: "d3a23e91-45e1-4e8f-84b5-0787b2198d00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.796651 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwktg\" (UniqueName: \"kubernetes.io/projected/4edeffd1-c6df-4546-8e0a-419f4ee657e8-kube-api-access-kwktg\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.796668 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3930526-a0aa-4712-bb01-762d53e5cdba-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.796683 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ced43a-3e30-4722-90ac-7b1184354703-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.796695 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4edeffd1-c6df-4546-8e0a-419f4ee657e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.796759 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wt5c\" (UniqueName: \"kubernetes.io/projected/c3930526-a0aa-4712-bb01-762d53e5cdba-kube-api-access-2wt5c\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 
19:12:14.799970 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a23e91-45e1-4e8f-84b5-0787b2198d00-kube-api-access-7x8c9" (OuterVolumeSpecName: "kube-api-access-7x8c9") pod "d3a23e91-45e1-4e8f-84b5-0787b2198d00" (UID: "d3a23e91-45e1-4e8f-84b5-0787b2198d00"). InnerVolumeSpecName "kube-api-access-7x8c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.905606 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a23e91-45e1-4e8f-84b5-0787b2198d00-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:14 crc kubenswrapper[4731]: I1203 19:12:14.905742 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x8c9\" (UniqueName: \"kubernetes.io/projected/d3a23e91-45e1-4e8f-84b5-0787b2198d00-kube-api-access-7x8c9\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.232668 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6755-account-create-update-cc66g" event={"ID":"4edeffd1-c6df-4546-8e0a-419f4ee657e8","Type":"ContainerDied","Data":"37929d22122b54f5cfc7023a223ebdcbfb0175fdeff44638f90eb573fc712195"} Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.232944 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37929d22122b54f5cfc7023a223ebdcbfb0175fdeff44638f90eb573fc712195" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.233010 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6755-account-create-update-cc66g" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.241759 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jpsqh" event={"ID":"c3930526-a0aa-4712-bb01-762d53e5cdba","Type":"ContainerDied","Data":"54a72a32d126282263f75acdc8631708d4c3fc7d21326d98d9fe2ccdc44046ff"} Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.241809 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a72a32d126282263f75acdc8631708d4c3fc7d21326d98d9fe2ccdc44046ff" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.241893 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jpsqh" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.252867 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c7rr9" event={"ID":"9705a938-570d-4016-be7e-60a1ed1ed1cc","Type":"ContainerStarted","Data":"cb6297e3c8a4863ce934262e1f438477f2bc278650540fa7a995eab685f234a0"} Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.264833 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"8bfecf2030d27a4b918f928727e91cabd220023a032f5fb66bf3b11e88f4d056"} Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.265565 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"77b54d5dadc01aed528fca94a9c54eb9fc940a9a47f3c3a223fac7adcb0faab4"} Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.265673 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"4a5dbc36f3fc48cb7cabe878bcbf67ecfca78bba1d39bd956a578a7eb03c2cdb"} Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.272743 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mjcmb" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.274418 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mjcmb" event={"ID":"d3a23e91-45e1-4e8f-84b5-0787b2198d00","Type":"ContainerDied","Data":"4fce46d4086ec9c5d33477e36677a52387f7e9ef80c80f04b2e6e4268f14b697"} Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.274466 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fce46d4086ec9c5d33477e36677a52387f7e9ef80c80f04b2e6e4268f14b697" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.277396 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-69a9-account-create-update-b4j6g" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.277463 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9zst" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.277995 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9zst" event={"ID":"29ced43a-3e30-4722-90ac-7b1184354703","Type":"ContainerDied","Data":"4d6c5cd0b49e853fa0f09bf4f5d4ad0695652a1fa90d325a351db4ab24398c4c"} Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.278036 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d6c5cd0b49e853fa0f09bf4f5d4ad0695652a1fa90d325a351db4ab24398c4c" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.278082 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ab2e-account-create-update-h8wtr" Dec 03 19:12:15 crc kubenswrapper[4731]: I1203 19:12:15.282303 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-c7rr9" podStartSLOduration=14.957222389 podStartE2EDuration="21.282229447s" podCreationTimestamp="2025-12-03 19:11:54 +0000 UTC" firstStartedPulling="2025-12-03 19:12:08.24839101 +0000 UTC m=+1048.846985474" lastFinishedPulling="2025-12-03 19:12:14.573398068 +0000 UTC m=+1055.171992532" observedRunningTime="2025-12-03 19:12:15.274768847 +0000 UTC m=+1055.873363311" watchObservedRunningTime="2025-12-03 19:12:15.282229447 +0000 UTC m=+1055.880823921" Dec 03 19:12:16 crc kubenswrapper[4731]: E1203 19:12:16.045462 4731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4edeffd1_c6df_4546_8e0a_419f4ee657e8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3930526_a0aa_4712_bb01_762d53e5cdba.slice\": RecentStats: unable to find data in memory cache]" Dec 03 19:12:16 crc kubenswrapper[4731]: I1203 19:12:16.296325 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"32a2d86cb231ead2e85c62c80042a6c9a7753c35bd97eb9ab4e538b4193b9f71"} Dec 03 19:12:18 crc kubenswrapper[4731]: I1203 19:12:18.428362 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"0d61f914ecd197ba3500989ea5bf920abcb370efd586a9258255959e1b1ab061"} Dec 03 19:12:18 crc kubenswrapper[4731]: I1203 19:12:18.428904 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"04f7783b51d8d3fbda627bc4bb000d6d477352b10c91a50b65e4fe941c2e5604"} Dec 03 19:12:18 crc kubenswrapper[4731]: I1203 19:12:18.428920 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"cd6fd388d09f9edadcabc1e43cac152b0737ee21bafc6e9c93db91e3eb6e7b32"} Dec 03 19:12:19 crc kubenswrapper[4731]: I1203 19:12:19.441980 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"0f9ec68fad45d12807b791638fc00b8e4902c6b637314a7ced44eaa16b58fee5"} Dec 03 19:12:19 crc kubenswrapper[4731]: I1203 19:12:19.443491 4731 generic.go:334] "Generic (PLEG): container finished" podID="9705a938-570d-4016-be7e-60a1ed1ed1cc" containerID="cb6297e3c8a4863ce934262e1f438477f2bc278650540fa7a995eab685f234a0" exitCode=0 Dec 03 19:12:19 crc kubenswrapper[4731]: I1203 19:12:19.443579 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c7rr9" event={"ID":"9705a938-570d-4016-be7e-60a1ed1ed1cc","Type":"ContainerDied","Data":"cb6297e3c8a4863ce934262e1f438477f2bc278650540fa7a995eab685f234a0"} Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.760566 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.877801 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwh4w\" (UniqueName: \"kubernetes.io/projected/9705a938-570d-4016-be7e-60a1ed1ed1cc-kube-api-access-wwh4w\") pod \"9705a938-570d-4016-be7e-60a1ed1ed1cc\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.878034 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-config-data\") pod \"9705a938-570d-4016-be7e-60a1ed1ed1cc\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.878135 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-combined-ca-bundle\") pod \"9705a938-570d-4016-be7e-60a1ed1ed1cc\" (UID: \"9705a938-570d-4016-be7e-60a1ed1ed1cc\") " Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.885729 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9705a938-570d-4016-be7e-60a1ed1ed1cc-kube-api-access-wwh4w" (OuterVolumeSpecName: "kube-api-access-wwh4w") pod "9705a938-570d-4016-be7e-60a1ed1ed1cc" (UID: "9705a938-570d-4016-be7e-60a1ed1ed1cc"). InnerVolumeSpecName "kube-api-access-wwh4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.910032 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9705a938-570d-4016-be7e-60a1ed1ed1cc" (UID: "9705a938-570d-4016-be7e-60a1ed1ed1cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.926265 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-config-data" (OuterVolumeSpecName: "config-data") pod "9705a938-570d-4016-be7e-60a1ed1ed1cc" (UID: "9705a938-570d-4016-be7e-60a1ed1ed1cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.980680 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.980721 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwh4w\" (UniqueName: \"kubernetes.io/projected/9705a938-570d-4016-be7e-60a1ed1ed1cc-kube-api-access-wwh4w\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:20 crc kubenswrapper[4731]: I1203 19:12:20.980734 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9705a938-570d-4016-be7e-60a1ed1ed1cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.462021 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c7rr9" event={"ID":"9705a938-570d-4016-be7e-60a1ed1ed1cc","Type":"ContainerDied","Data":"b4a7d2a822edf1ab628dab0d4898cf8f8c3efd89692e3883117332f8dc0dcc93"} Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.462062 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4a7d2a822edf1ab628dab0d4898cf8f8c3efd89692e3883117332f8dc0dcc93" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.462067 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c7rr9" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.753699 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n2v5g"] Dec 03 19:12:21 crc kubenswrapper[4731]: E1203 19:12:21.754148 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf45b3a-a5e8-4958-86e7-734c6e3f9092" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.754167 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf45b3a-a5e8-4958-86e7-734c6e3f9092" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: E1203 19:12:21.754183 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c398f19-6cd0-4998-ba26-d993cbee31e4" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.754191 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c398f19-6cd0-4998-ba26-d993cbee31e4" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: E1203 19:12:21.754212 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ced43a-3e30-4722-90ac-7b1184354703" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.754221 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ced43a-3e30-4722-90ac-7b1184354703" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: E1203 19:12:21.754237 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9705a938-570d-4016-be7e-60a1ed1ed1cc" containerName="keystone-db-sync" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.754245 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="9705a938-570d-4016-be7e-60a1ed1ed1cc" containerName="keystone-db-sync" Dec 03 19:12:21 crc kubenswrapper[4731]: E1203 19:12:21.754923 4731 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d3a23e91-45e1-4e8f-84b5-0787b2198d00" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.754936 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a23e91-45e1-4e8f-84b5-0787b2198d00" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: E1203 19:12:21.754978 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3930526-a0aa-4712-bb01-762d53e5cdba" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.754987 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3930526-a0aa-4712-bb01-762d53e5cdba" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: E1203 19:12:21.755011 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edeffd1-c6df-4546-8e0a-419f4ee657e8" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.755021 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edeffd1-c6df-4546-8e0a-419f4ee657e8" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.755246 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf45b3a-a5e8-4958-86e7-734c6e3f9092" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.755304 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ced43a-3e30-4722-90ac-7b1184354703" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.755324 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3930526-a0aa-4712-bb01-762d53e5cdba" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.755345 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="9705a938-570d-4016-be7e-60a1ed1ed1cc" containerName="keystone-db-sync" Dec 03 19:12:21 crc 
kubenswrapper[4731]: I1203 19:12:21.755438 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c398f19-6cd0-4998-ba26-d993cbee31e4" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.755453 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a23e91-45e1-4e8f-84b5-0787b2198d00" containerName="mariadb-database-create" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.755464 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edeffd1-c6df-4546-8e0a-419f4ee657e8" containerName="mariadb-account-create-update" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.764911 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.769123 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rlxdn" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.769409 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.769572 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.770304 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.779498 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.780363 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n2v5g"] Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.904542 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-scripts\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.904971 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9g4q\" (UniqueName: \"kubernetes.io/projected/44538ac3-e8f5-4de9-9a95-a45baee0306d-kube-api-access-g9g4q\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.904996 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-fernet-keys\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.905033 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-combined-ca-bundle\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.905079 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-credential-keys\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.905111 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-config-data\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.969808 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b84f748f-qg75m"] Dec 03 19:12:21 crc kubenswrapper[4731]: I1203 19:12:21.992650 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.000055 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-d8bpz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.000361 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.001824 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.008028 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-combined-ca-bundle\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.008156 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-credential-keys\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.008197 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-config-data\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.008327 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-scripts\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.008358 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9g4q\" (UniqueName: \"kubernetes.io/projected/44538ac3-e8f5-4de9-9a95-a45baee0306d-kube-api-access-g9g4q\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.008381 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-fernet-keys\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.012368 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.016267 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-combined-ca-bundle\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.024572 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-fernet-keys\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.049466 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-config-data\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.050376 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-scripts\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.052759 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-credential-keys\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.065162 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9g4q\" (UniqueName: \"kubernetes.io/projected/44538ac3-e8f5-4de9-9a95-a45baee0306d-kube-api-access-g9g4q\") pod \"keystone-bootstrap-n2v5g\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.110727 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.112876 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/835ae369-11e8-4c50-9757-c9a0196c977b-horizon-secret-key\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.112952 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z4h9\" (UniqueName: \"kubernetes.io/projected/835ae369-11e8-4c50-9757-c9a0196c977b-kube-api-access-2z4h9\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.112982 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/835ae369-11e8-4c50-9757-c9a0196c977b-logs\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.113026 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-scripts\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.113057 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-config-data\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " 
pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.127399 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bhw9w"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.132723 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.139545 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.140006 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mqzw2" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.140236 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.189299 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bhw9w"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.210907 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b84f748f-qg75m"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.214707 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-scripts\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.214760 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-config-data\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.214887 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/835ae369-11e8-4c50-9757-c9a0196c977b-horizon-secret-key\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.214968 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z4h9\" (UniqueName: \"kubernetes.io/projected/835ae369-11e8-4c50-9757-c9a0196c977b-kube-api-access-2z4h9\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.215000 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/835ae369-11e8-4c50-9757-c9a0196c977b-logs\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.215617 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/835ae369-11e8-4c50-9757-c9a0196c977b-logs\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.219033 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-scripts\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.229616 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-config-data\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.230748 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/835ae369-11e8-4c50-9757-c9a0196c977b-horizon-secret-key\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.246409 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z4h9\" (UniqueName: \"kubernetes.io/projected/835ae369-11e8-4c50-9757-c9a0196c977b-kube-api-access-2z4h9\") pod \"horizon-b84f748f-qg75m\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.274351 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kr6lg"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.274414 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.275700 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.280811 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.280974 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.281156 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-fjtjs" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.282722 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kr6lg"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.289710 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-869c6db985-g6p8s"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.291729 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.297464 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-869c6db985-g6p8s"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.310244 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tx58l"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.311720 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.314463 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lh84f" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.314763 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.317008 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-db-sync-config-data\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.317185 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-combined-ca-bundle\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.317335 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-schb6\" (UniqueName: \"kubernetes.io/projected/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-kube-api-access-schb6\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.317216 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tx58l"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.317471 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-etc-machine-id\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.317576 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-config-data\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.318098 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-scripts\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.426759 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-config-data\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427549 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m478c\" (UniqueName: \"kubernetes.io/projected/936695f7-54f4-4fa7-8373-75c84337ea1f-kube-api-access-m478c\") pod \"barbican-db-sync-tx58l\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427608 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-combined-ca-bundle\") pod \"barbican-db-sync-tx58l\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427664 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-scripts\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427686 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14214be-643d-4566-a67c-0cf22e9d65f2-logs\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427720 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-config-data\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427756 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-scripts\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427805 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znhl7\" (UniqueName: 
\"kubernetes.io/projected/5af3fc14-8410-4706-957b-2f95a972c64e-kube-api-access-znhl7\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427843 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c14214be-643d-4566-a67c-0cf22e9d65f2-horizon-secret-key\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427869 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-combined-ca-bundle\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427888 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-db-sync-config-data\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427928 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-combined-ca-bundle\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427948 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-config\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.427968 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-274lv\" (UniqueName: \"kubernetes.io/projected/c14214be-643d-4566-a67c-0cf22e9d65f2-kube-api-access-274lv\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.428010 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-schb6\" (UniqueName: \"kubernetes.io/projected/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-kube-api-access-schb6\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.429593 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-db-sync-config-data\") pod \"barbican-db-sync-tx58l\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.429639 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-etc-machine-id\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.429740 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-etc-machine-id\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.436433 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-db-sync-config-data\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.445742 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-scripts\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.447110 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-config-data\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.449215 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-combined-ca-bundle\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.457152 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.474660 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.476061 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-schb6\" (UniqueName: \"kubernetes.io/projected/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-kube-api-access-schb6\") pod \"cinder-db-sync-bhw9w\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.482364 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.510428 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.510481 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.532768 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-combined-ca-bundle\") pod \"barbican-db-sync-tx58l\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.532857 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-scripts\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.532892 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14214be-643d-4566-a67c-0cf22e9d65f2-logs\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " 
pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.532923 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-config-data\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.532998 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znhl7\" (UniqueName: \"kubernetes.io/projected/5af3fc14-8410-4706-957b-2f95a972c64e-kube-api-access-znhl7\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.533028 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c14214be-643d-4566-a67c-0cf22e9d65f2-horizon-secret-key\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.533053 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-combined-ca-bundle\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.533105 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-config\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.533129 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-274lv\" (UniqueName: \"kubernetes.io/projected/c14214be-643d-4566-a67c-0cf22e9d65f2-kube-api-access-274lv\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.533185 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-db-sync-config-data\") pod \"barbican-db-sync-tx58l\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.533230 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m478c\" (UniqueName: \"kubernetes.io/projected/936695f7-54f4-4fa7-8373-75c84337ea1f-kube-api-access-m478c\") pod \"barbican-db-sync-tx58l\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.536495 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14214be-643d-4566-a67c-0cf22e9d65f2-logs\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.537313 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-scripts\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.537564 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-config-data\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.548051 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-combined-ca-bundle\") pod \"barbican-db-sync-tx58l\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.553682 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-config\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.554108 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-combined-ca-bundle\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.560500 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c14214be-643d-4566-a67c-0cf22e9d65f2-horizon-secret-key\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.560918 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-db-sync-config-data\") pod \"barbican-db-sync-tx58l\" (UID: 
\"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.562054 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"190cf61dd6d42c5c770df757642e3d1361998c0b72e3ad217a2511b2cec651f7"} Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.562082 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m478c\" (UniqueName: \"kubernetes.io/projected/936695f7-54f4-4fa7-8373-75c84337ea1f-kube-api-access-m478c\") pod \"barbican-db-sync-tx58l\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.565001 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znhl7\" (UniqueName: \"kubernetes.io/projected/5af3fc14-8410-4706-957b-2f95a972c64e-kube-api-access-znhl7\") pod \"neutron-db-sync-kr6lg\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.571205 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-274lv\" (UniqueName: \"kubernetes.io/projected/c14214be-643d-4566-a67c-0cf22e9d65f2-kube-api-access-274lv\") pod \"horizon-869c6db985-g6p8s\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.614790 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-28phz"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.616315 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.632804 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.632847 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2m8tl" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.632805 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.633118 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-28phz"] Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.634595 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.634630 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-scripts\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.634699 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqw9v\" (UniqueName: \"kubernetes.io/projected/1df5a151-2186-4252-b04e-148a055b6a9d-kube-api-access-fqw9v\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.634746 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-log-httpd\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.635120 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.635161 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-run-httpd\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.635202 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-config-data\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.675716 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.713069 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.739470 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-config-data\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.739548 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.740763 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-combined-ca-bundle\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.740936 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-run-httpd\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741048 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-config-data\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741073 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-scripts\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741116 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741151 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-scripts\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741301 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74dfcf3b-850d-48ea-9188-297579680f01-logs\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741346 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x8gb\" (UniqueName: \"kubernetes.io/projected/74dfcf3b-850d-48ea-9188-297579680f01-kube-api-access-4x8gb\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741369 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqw9v\" (UniqueName: 
\"kubernetes.io/projected/1df5a151-2186-4252-b04e-148a055b6a9d-kube-api-access-fqw9v\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741416 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-log-httpd\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.741968 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-log-httpd\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.742196 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-run-httpd\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.744816 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.746604 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-config-data\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.758787 4731 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.762550 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.763544 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-scripts\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.766998 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqw9v\" (UniqueName: \"kubernetes.io/projected/1df5a151-2186-4252-b04e-148a055b6a9d-kube-api-access-fqw9v\") pod \"ceilometer-0\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.772953 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tx58l" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.842605 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-config-data\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.842662 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-combined-ca-bundle\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.842719 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-scripts\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.842781 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74dfcf3b-850d-48ea-9188-297579680f01-logs\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.842802 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x8gb\" (UniqueName: \"kubernetes.io/projected/74dfcf3b-850d-48ea-9188-297579680f01-kube-api-access-4x8gb\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.843723 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74dfcf3b-850d-48ea-9188-297579680f01-logs\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.849399 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.849585 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-config-data\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.852275 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-scripts\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.862053 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-combined-ca-bundle\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:22 crc kubenswrapper[4731]: I1203 19:12:22.864820 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x8gb\" (UniqueName: \"kubernetes.io/projected/74dfcf3b-850d-48ea-9188-297579680f01-kube-api-access-4x8gb\") pod \"placement-db-sync-28phz\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " pod="openstack/placement-db-sync-28phz" Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.040734 4731 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-28phz" Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.128242 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n2v5g"] Dec 03 19:12:23 crc kubenswrapper[4731]: W1203 19:12:23.238433 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44538ac3_e8f5_4de9_9a95_a45baee0306d.slice/crio-454379a73665a72f9cf39e549c1851077526c09e34a5c8c2694bb7f22a325126 WatchSource:0}: Error finding container 454379a73665a72f9cf39e549c1851077526c09e34a5c8c2694bb7f22a325126: Status 404 returned error can't find the container with id 454379a73665a72f9cf39e549c1851077526c09e34a5c8c2694bb7f22a325126 Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.428698 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b84f748f-qg75m"] Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.576539 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2v5g" event={"ID":"44538ac3-e8f5-4de9-9a95-a45baee0306d","Type":"ContainerStarted","Data":"454379a73665a72f9cf39e549c1851077526c09e34a5c8c2694bb7f22a325126"} Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.604434 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"11ba66c83de845319924aae7aa11f9a0441d81526fe4744ad3fb865f2ede4eb7"} Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.604480 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8327126f-a2f3-4b2d-a5b3-118bfa1f41ce","Type":"ContainerStarted","Data":"adef6737d871dec64e4e9ca963d8454ac1bb56c68f0f7377a7e7ae591d8211a5"} Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.657326 4731 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=40.388833622 podStartE2EDuration="1m4.657300843s" podCreationTimestamp="2025-12-03 19:11:19 +0000 UTC" firstStartedPulling="2025-12-03 19:11:53.375435566 +0000 UTC m=+1033.974030030" lastFinishedPulling="2025-12-03 19:12:17.643902787 +0000 UTC m=+1058.242497251" observedRunningTime="2025-12-03 19:12:23.656573591 +0000 UTC m=+1064.255168065" watchObservedRunningTime="2025-12-03 19:12:23.657300843 +0000 UTC m=+1064.255895317" Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.952000 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tx58l"] Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.971130 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bhw9w"] Dec 03 19:12:23 crc kubenswrapper[4731]: I1203 19:12:23.994407 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kr6lg"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.015466 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759c88d79f-424sd"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.016915 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.020638 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.039598 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759c88d79f-424sd"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.069573 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-869c6db985-g6p8s"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.085110 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-config\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.085155 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-nb\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.085185 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-dns-swift-storage-0\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.085204 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-sb\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.085233 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc28s\" (UniqueName: \"kubernetes.io/projected/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-kube-api-access-rc28s\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.087839 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-28phz"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.109579 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.187197 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-config\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.187283 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-nb\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.187311 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-dns-swift-storage-0\") pod 
\"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.187335 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-sb\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.187365 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc28s\" (UniqueName: \"kubernetes.io/projected/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-kube-api-access-rc28s\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.188475 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-config\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.188505 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-dns-swift-storage-0\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.188481 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-sb\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: 
\"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.188908 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-nb\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.208974 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc28s\" (UniqueName: \"kubernetes.io/projected/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-kube-api-access-rc28s\") pod \"dnsmasq-dns-759c88d79f-424sd\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.386366 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:24 crc kubenswrapper[4731]: W1203 19:12:24.391321 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod936695f7_54f4_4fa7_8373_75c84337ea1f.slice/crio-37ce14231c9fc945e9742968726f1b3a097747084e5ca372fd48350a5ba33ef1 WatchSource:0}: Error finding container 37ce14231c9fc945e9742968726f1b3a097747084e5ca372fd48350a5ba33ef1: Status 404 returned error can't find the container with id 37ce14231c9fc945e9742968726f1b3a097747084e5ca372fd48350a5ba33ef1 Dec 03 19:12:24 crc kubenswrapper[4731]: W1203 19:12:24.393389 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5af3fc14_8410_4706_957b_2f95a972c64e.slice/crio-f99478a31439ed33bf0fdeb3ba08b29855561adbc73d882591aa876098e4a2b5 WatchSource:0}: Error finding container 
f99478a31439ed33bf0fdeb3ba08b29855561adbc73d882591aa876098e4a2b5: Status 404 returned error can't find the container with id f99478a31439ed33bf0fdeb3ba08b29855561adbc73d882591aa876098e4a2b5 Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.616303 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b84f748f-qg75m"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.625139 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kr6lg" event={"ID":"5af3fc14-8410-4706-957b-2f95a972c64e","Type":"ContainerStarted","Data":"f99478a31439ed33bf0fdeb3ba08b29855561adbc73d882591aa876098e4a2b5"} Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.633286 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-28phz" event={"ID":"74dfcf3b-850d-48ea-9188-297579680f01","Type":"ContainerStarted","Data":"8cdad720fe03671ba979dce2245fa68dbf23bd669d9839b1fc5eba1fcd6e6f0f"} Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.638790 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.644871 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tx58l" event={"ID":"936695f7-54f4-4fa7-8373-75c84337ea1f","Type":"ContainerStarted","Data":"37ce14231c9fc945e9742968726f1b3a097747084e5ca372fd48350a5ba33ef1"} Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.668658 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bhw9w" event={"ID":"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee","Type":"ContainerStarted","Data":"2b1b113070d62638219d159d56c00bae75a4182cd493b61e7a2237f69d5bc016"} Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.687615 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-869c6db985-g6p8s" 
event={"ID":"c14214be-643d-4566-a67c-0cf22e9d65f2","Type":"ContainerStarted","Data":"0110396026afac3a01dd9ea0f3c80fa74a44c95241990e1779509d4626338c72"} Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.695149 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerStarted","Data":"26fb2e8078bb336508bc5a22fd0ab0c4b6642dbfb726f8e0f3404f3cce8117b4"} Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.706643 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5987d74c57-mktwp"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.708742 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.709900 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b84f748f-qg75m" event={"ID":"835ae369-11e8-4c50-9757-c9a0196c977b","Type":"ContainerStarted","Data":"45a17fba3f6196831d96e676c1e69bd1f1c5b7fb96b7df94c95aa5dbbfb475a0"} Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.718108 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5987d74c57-mktwp"] Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.814295 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-scripts\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.814410 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deb4651-a013-492a-ae40-938b1bc6ece4-logs\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " 
pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.814543 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbt2\" (UniqueName: \"kubernetes.io/projected/0deb4651-a013-492a-ae40-938b1bc6ece4-kube-api-access-vjbt2\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.814586 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0deb4651-a013-492a-ae40-938b1bc6ece4-horizon-secret-key\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.814756 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-config-data\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.916812 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0deb4651-a013-492a-ae40-938b1bc6ece4-horizon-secret-key\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.916913 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-config-data\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " 
pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.917071 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-scripts\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.917128 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deb4651-a013-492a-ae40-938b1bc6ece4-logs\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.917184 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbt2\" (UniqueName: \"kubernetes.io/projected/0deb4651-a013-492a-ae40-938b1bc6ece4-kube-api-access-vjbt2\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.918945 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deb4651-a013-492a-ae40-938b1bc6ece4-logs\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.919765 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-scripts\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.919766 4731 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-config-data\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.926310 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0deb4651-a013-492a-ae40-938b1bc6ece4-horizon-secret-key\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:24 crc kubenswrapper[4731]: I1203 19:12:24.939435 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbt2\" (UniqueName: \"kubernetes.io/projected/0deb4651-a013-492a-ae40-938b1bc6ece4-kube-api-access-vjbt2\") pod \"horizon-5987d74c57-mktwp\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.037214 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.105886 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759c88d79f-424sd"] Dec 03 19:12:25 crc kubenswrapper[4731]: W1203 19:12:25.118549 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25d0a493_a0ec_4cdf_a367_3a60aab1ccfa.slice/crio-9b5a8c680d72e088e67020368b830e55941ed567de223b2dd2e114b35f813457 WatchSource:0}: Error finding container 9b5a8c680d72e088e67020368b830e55941ed567de223b2dd2e114b35f813457: Status 404 returned error can't find the container with id 9b5a8c680d72e088e67020368b830e55941ed567de223b2dd2e114b35f813457 Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.524321 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5987d74c57-mktwp"] Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.726797 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kr6lg" event={"ID":"5af3fc14-8410-4706-957b-2f95a972c64e","Type":"ContainerStarted","Data":"8ffb57c2f3b98c1838f99eeb36c978ef0d034e0d05c43179152a8e68fd6e5972"} Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.732972 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5987d74c57-mktwp" event={"ID":"0deb4651-a013-492a-ae40-938b1bc6ece4","Type":"ContainerStarted","Data":"772169fddd947c2f2fd6af0269ec6c3f8c5094b3714067b8efa1b2380ddca8ad"} Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.734140 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759c88d79f-424sd" event={"ID":"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa","Type":"ContainerStarted","Data":"9b5a8c680d72e088e67020368b830e55941ed567de223b2dd2e114b35f813457"} Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.753422 4731 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/neutron-db-sync-kr6lg" podStartSLOduration=3.753394541 podStartE2EDuration="3.753394541s" podCreationTimestamp="2025-12-03 19:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:25.744782855 +0000 UTC m=+1066.343377319" watchObservedRunningTime="2025-12-03 19:12:25.753394541 +0000 UTC m=+1066.351989005" Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.764436 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s4dgb" event={"ID":"7357e9a7-ce03-47ff-a1a5-55b8d1280d31","Type":"ContainerStarted","Data":"539ddb5b9571670f65f466f759e585daf9a16e518c7bae9e77c50ec7d85a005d"} Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.785959 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2v5g" event={"ID":"44538ac3-e8f5-4de9-9a95-a45baee0306d","Type":"ContainerStarted","Data":"bfc7fb4df8b00d9f01dfea5ec173884742f3a1a439ee6fac163282b77b77ada9"} Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.803506 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-s4dgb" podStartSLOduration=4.623206438 podStartE2EDuration="38.803473198s" podCreationTimestamp="2025-12-03 19:11:47 +0000 UTC" firstStartedPulling="2025-12-03 19:11:48.385559827 +0000 UTC m=+1028.984154291" lastFinishedPulling="2025-12-03 19:12:22.565826587 +0000 UTC m=+1063.164421051" observedRunningTime="2025-12-03 19:12:25.788903558 +0000 UTC m=+1066.387498022" watchObservedRunningTime="2025-12-03 19:12:25.803473198 +0000 UTC m=+1066.402067662" Dec 03 19:12:25 crc kubenswrapper[4731]: I1203 19:12:25.843451 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n2v5g" podStartSLOduration=4.843422402 podStartE2EDuration="4.843422402s" podCreationTimestamp="2025-12-03 19:12:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:25.817050477 +0000 UTC m=+1066.415644941" watchObservedRunningTime="2025-12-03 19:12:25.843422402 +0000 UTC m=+1066.442016866" Dec 03 19:12:26 crc kubenswrapper[4731]: I1203 19:12:26.468877 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:12:26 crc kubenswrapper[4731]: I1203 19:12:26.469142 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:12:26 crc kubenswrapper[4731]: I1203 19:12:26.812664 4731 generic.go:334] "Generic (PLEG): container finished" podID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" containerID="4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b" exitCode=0 Dec 03 19:12:26 crc kubenswrapper[4731]: I1203 19:12:26.812892 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759c88d79f-424sd" event={"ID":"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa","Type":"ContainerDied","Data":"4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b"} Dec 03 19:12:27 crc kubenswrapper[4731]: I1203 19:12:27.841437 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759c88d79f-424sd" event={"ID":"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa","Type":"ContainerStarted","Data":"7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a"} Dec 03 19:12:27 crc kubenswrapper[4731]: I1203 19:12:27.842511 4731 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:27 crc kubenswrapper[4731]: I1203 19:12:27.885274 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-759c88d79f-424sd" podStartSLOduration=4.885227283 podStartE2EDuration="4.885227283s" podCreationTimestamp="2025-12-03 19:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:27.878854537 +0000 UTC m=+1068.477449001" watchObservedRunningTime="2025-12-03 19:12:27.885227283 +0000 UTC m=+1068.483821747" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.702199 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-869c6db985-g6p8s"] Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.738545 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-687f68f6b4-jvzgv"] Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.740043 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.746487 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.750196 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-687f68f6b4-jvzgv"] Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.773090 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-tls-certs\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.773131 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-secret-key\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.773163 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-scripts\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.773242 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdsbx\" (UniqueName: \"kubernetes.io/projected/f69c7907-c04f-4b84-9e31-59fca146a62d-kube-api-access-sdsbx\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 
03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.773293 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-combined-ca-bundle\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.773337 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-config-data\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.773372 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f69c7907-c04f-4b84-9e31-59fca146a62d-logs\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.809820 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5987d74c57-mktwp"] Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.831523 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7578458cb-br8st"] Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.835354 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.848910 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7578458cb-br8st"] Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.881737 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-config-data\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.881872 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f69c7907-c04f-4b84-9e31-59fca146a62d-logs\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.881906 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a524772-1481-4781-9847-f3394664a2d3-scripts\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.881967 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-horizon-secret-key\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.881999 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-tls-certs\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882018 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-secret-key\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882124 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-combined-ca-bundle\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882150 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-scripts\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882229 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a524772-1481-4781-9847-f3394664a2d3-logs\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882292 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-horizon-tls-certs\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882408 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a524772-1481-4781-9847-f3394664a2d3-config-data\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882450 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdsbx\" (UniqueName: \"kubernetes.io/projected/f69c7907-c04f-4b84-9e31-59fca146a62d-kube-api-access-sdsbx\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882494 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fshz5\" (UniqueName: \"kubernetes.io/projected/5a524772-1481-4781-9847-f3394664a2d3-kube-api-access-fshz5\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.882518 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-combined-ca-bundle\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.883069 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f69c7907-c04f-4b84-9e31-59fca146a62d-logs\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.883178 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-config-data\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.883185 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-scripts\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.914973 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-tls-certs\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.916783 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-combined-ca-bundle\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.917067 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-secret-key\") pod \"horizon-687f68f6b4-jvzgv\" (UID: 
\"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.923907 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdsbx\" (UniqueName: \"kubernetes.io/projected/f69c7907-c04f-4b84-9e31-59fca146a62d-kube-api-access-sdsbx\") pod \"horizon-687f68f6b4-jvzgv\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.985067 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fshz5\" (UniqueName: \"kubernetes.io/projected/5a524772-1481-4781-9847-f3394664a2d3-kube-api-access-fshz5\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.985634 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a524772-1481-4781-9847-f3394664a2d3-scripts\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.985743 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-horizon-secret-key\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.985839 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-combined-ca-bundle\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" 
Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.985928 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a524772-1481-4781-9847-f3394664a2d3-logs\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.986046 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-horizon-tls-certs\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.986156 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a524772-1481-4781-9847-f3394664a2d3-config-data\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.986624 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a524772-1481-4781-9847-f3394664a2d3-scripts\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.986959 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a524772-1481-4781-9847-f3394664a2d3-logs\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.987625 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5a524772-1481-4781-9847-f3394664a2d3-config-data\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.990967 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-combined-ca-bundle\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:30 crc kubenswrapper[4731]: I1203 19:12:30.991042 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-horizon-secret-key\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:31 crc kubenswrapper[4731]: I1203 19:12:30.997098 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a524772-1481-4781-9847-f3394664a2d3-horizon-tls-certs\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:31 crc kubenswrapper[4731]: I1203 19:12:31.009192 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fshz5\" (UniqueName: \"kubernetes.io/projected/5a524772-1481-4781-9847-f3394664a2d3-kube-api-access-fshz5\") pod \"horizon-7578458cb-br8st\" (UID: \"5a524772-1481-4781-9847-f3394664a2d3\") " pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:31 crc kubenswrapper[4731]: I1203 19:12:31.078141 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:12:31 crc kubenswrapper[4731]: I1203 19:12:31.165922 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7578458cb-br8st" Dec 03 19:12:31 crc kubenswrapper[4731]: I1203 19:12:31.902215 4731 generic.go:334] "Generic (PLEG): container finished" podID="44538ac3-e8f5-4de9-9a95-a45baee0306d" containerID="bfc7fb4df8b00d9f01dfea5ec173884742f3a1a439ee6fac163282b77b77ada9" exitCode=0 Dec 03 19:12:31 crc kubenswrapper[4731]: I1203 19:12:31.902291 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2v5g" event={"ID":"44538ac3-e8f5-4de9-9a95-a45baee0306d","Type":"ContainerDied","Data":"bfc7fb4df8b00d9f01dfea5ec173884742f3a1a439ee6fac163282b77b77ada9"} Dec 03 19:12:34 crc kubenswrapper[4731]: I1203 19:12:34.389367 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:12:34 crc kubenswrapper[4731]: I1203 19:12:34.466761 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bc95b79f5-smj25"] Dec 03 19:12:34 crc kubenswrapper[4731]: I1203 19:12:34.467339 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerName="dnsmasq-dns" containerID="cri-o://2b92f0683b098abbb47b060363eed7d671e07ab138a12ce302a1c66801c7f1ef" gracePeriod=10 Dec 03 19:12:34 crc kubenswrapper[4731]: I1203 19:12:34.929990 4731 generic.go:334] "Generic (PLEG): container finished" podID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerID="2b92f0683b098abbb47b060363eed7d671e07ab138a12ce302a1c66801c7f1ef" exitCode=0 Dec 03 19:12:34 crc kubenswrapper[4731]: I1203 19:12:34.930041 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" 
event={"ID":"4d11f484-2fc0-41d0-bac8-24eb2d301146","Type":"ContainerDied","Data":"2b92f0683b098abbb47b060363eed7d671e07ab138a12ce302a1c66801c7f1ef"} Dec 03 19:12:41 crc kubenswrapper[4731]: E1203 19:12:41.040138 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 03 19:12:41 crc kubenswrapper[4731]: E1203 19:12:41.042082 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb8h59bh56fh5bbh7bh597h594h685h5d5h68fhdfh66h5d8h669h558hbfhf7h64bh68ch55bh677h77h54chcch5d8h64dh698h594h5b7hcfh85hddq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-274lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[
],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-869c6db985-g6p8s_openstack(c14214be-643d-4566-a67c-0cf22e9d65f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:12:41 crc kubenswrapper[4731]: E1203 19:12:41.049205 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-869c6db985-g6p8s" podUID="c14214be-643d-4566-a67c-0cf22e9d65f2" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.326304 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.472922 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-scripts\") pod \"44538ac3-e8f5-4de9-9a95-a45baee0306d\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.472988 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-config-data\") pod \"44538ac3-e8f5-4de9-9a95-a45baee0306d\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.473052 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-combined-ca-bundle\") pod \"44538ac3-e8f5-4de9-9a95-a45baee0306d\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.473080 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9g4q\" (UniqueName: \"kubernetes.io/projected/44538ac3-e8f5-4de9-9a95-a45baee0306d-kube-api-access-g9g4q\") pod \"44538ac3-e8f5-4de9-9a95-a45baee0306d\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.473123 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-fernet-keys\") pod \"44538ac3-e8f5-4de9-9a95-a45baee0306d\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.473291 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-credential-keys\") pod \"44538ac3-e8f5-4de9-9a95-a45baee0306d\" (UID: \"44538ac3-e8f5-4de9-9a95-a45baee0306d\") " Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.484489 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-scripts" (OuterVolumeSpecName: "scripts") pod "44538ac3-e8f5-4de9-9a95-a45baee0306d" (UID: "44538ac3-e8f5-4de9-9a95-a45baee0306d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.484527 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "44538ac3-e8f5-4de9-9a95-a45baee0306d" (UID: "44538ac3-e8f5-4de9-9a95-a45baee0306d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.484529 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44538ac3-e8f5-4de9-9a95-a45baee0306d-kube-api-access-g9g4q" (OuterVolumeSpecName: "kube-api-access-g9g4q") pod "44538ac3-e8f5-4de9-9a95-a45baee0306d" (UID: "44538ac3-e8f5-4de9-9a95-a45baee0306d"). InnerVolumeSpecName "kube-api-access-g9g4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.499475 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "44538ac3-e8f5-4de9-9a95-a45baee0306d" (UID: "44538ac3-e8f5-4de9-9a95-a45baee0306d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.505120 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-config-data" (OuterVolumeSpecName: "config-data") pod "44538ac3-e8f5-4de9-9a95-a45baee0306d" (UID: "44538ac3-e8f5-4de9-9a95-a45baee0306d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.506956 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44538ac3-e8f5-4de9-9a95-a45baee0306d" (UID: "44538ac3-e8f5-4de9-9a95-a45baee0306d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.575814 4731 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.576108 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.576212 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.576315 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:41 crc 
kubenswrapper[4731]: I1203 19:12:41.576387 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9g4q\" (UniqueName: \"kubernetes.io/projected/44538ac3-e8f5-4de9-9a95-a45baee0306d-kube-api-access-g9g4q\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:41 crc kubenswrapper[4731]: I1203 19:12:41.576454 4731 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44538ac3-e8f5-4de9-9a95-a45baee0306d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:41 crc kubenswrapper[4731]: E1203 19:12:41.772180 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 03 19:12:41 crc kubenswrapper[4731]: E1203 19:12:41.772454 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m478c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-tx58l_openstack(936695f7-54f4-4fa7-8373-75c84337ea1f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:12:41 crc kubenswrapper[4731]: E1203 19:12:41.773715 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-tx58l" 
podUID="936695f7-54f4-4fa7-8373-75c84337ea1f" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.015640 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2v5g" event={"ID":"44538ac3-e8f5-4de9-9a95-a45baee0306d","Type":"ContainerDied","Data":"454379a73665a72f9cf39e549c1851077526c09e34a5c8c2694bb7f22a325126"} Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.016017 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="454379a73665a72f9cf39e549c1851077526c09e34a5c8c2694bb7f22a325126" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.015776 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n2v5g" Dec 03 19:12:42 crc kubenswrapper[4731]: E1203 19:12:42.017421 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-tx58l" podUID="936695f7-54f4-4fa7-8373-75c84337ea1f" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.422667 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n2v5g"] Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.430432 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n2v5g"] Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.523542 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qbnk6"] Dec 03 19:12:42 crc kubenswrapper[4731]: E1203 19:12:42.524045 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44538ac3-e8f5-4de9-9a95-a45baee0306d" containerName="keystone-bootstrap" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.524074 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="44538ac3-e8f5-4de9-9a95-a45baee0306d" 
containerName="keystone-bootstrap" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.524488 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="44538ac3-e8f5-4de9-9a95-a45baee0306d" containerName="keystone-bootstrap" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.525335 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.528099 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.528389 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.528595 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.532734 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.532828 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rlxdn" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.534980 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qbnk6"] Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.667331 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: i/o timeout" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.707369 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-combined-ca-bundle\") pod 
\"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.707631 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnt6p\" (UniqueName: \"kubernetes.io/projected/eb136717-d762-4d86-8b8b-e3a59cadb469-kube-api-access-mnt6p\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.707729 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-config-data\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.707753 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-fernet-keys\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.707811 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-credential-keys\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.707879 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-scripts\") pod 
\"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.810021 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-credential-keys\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.810378 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-scripts\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.810432 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-combined-ca-bundle\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.810488 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnt6p\" (UniqueName: \"kubernetes.io/projected/eb136717-d762-4d86-8b8b-e3a59cadb469-kube-api-access-mnt6p\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.810554 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-config-data\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " 
pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.810569 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-fernet-keys\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.815369 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-scripts\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.815839 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-fernet-keys\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.816868 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-credential-keys\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.817230 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-combined-ca-bundle\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.817816 4731 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-config-data\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.833053 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnt6p\" (UniqueName: \"kubernetes.io/projected/eb136717-d762-4d86-8b8b-e3a59cadb469-kube-api-access-mnt6p\") pod \"keystone-bootstrap-qbnk6\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:42 crc kubenswrapper[4731]: I1203 19:12:42.851160 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.029089 4731 generic.go:334] "Generic (PLEG): container finished" podID="7357e9a7-ce03-47ff-a1a5-55b8d1280d31" containerID="539ddb5b9571670f65f466f759e585daf9a16e518c7bae9e77c50ec7d85a005d" exitCode=0 Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.029153 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s4dgb" event={"ID":"7357e9a7-ce03-47ff-a1a5-55b8d1280d31","Type":"ContainerDied","Data":"539ddb5b9571670f65f466f759e585daf9a16e518c7bae9e77c50ec7d85a005d"} Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 19:12:43.545453 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 19:12:43.546141 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4x8gb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-28phz_openstack(74dfcf3b-850d-48ea-9188-297579680f01): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 19:12:43.548208 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-28phz" podUID="74dfcf3b-850d-48ea-9188-297579680f01" Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 19:12:43.559498 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 19:12:43.559706 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbch54hch5d8h5ddh8dh577h664h584h56ch684h55ch68dh57ch5fch8fh5b6h9bh674h59bh695h669h67dh54bh694h575h5bdh5cchb6h597h547h558q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z4h9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-b84f748f-qg75m_openstack(835ae369-11e8-4c50-9757-c9a0196c977b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 
19:12:43.564087 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-b84f748f-qg75m" podUID="835ae369-11e8-4c50-9757-c9a0196c977b" Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 19:12:43.574922 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 19:12:43.575080 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n545h655h5f5h5bch5d5hd6h58h596h76h564hcfh65dh99h597h78h64dh54bh5dfhc8h74h5bch698hc6h594h5c9h576h568hb6h59bh548h67bh646q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vjbt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5987d74c57-mktwp_openstack(0deb4651-a013-492a-ae40-938b1bc6ece4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:12:43 crc kubenswrapper[4731]: E1203 
19:12:43.577768 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5987d74c57-mktwp" podUID="0deb4651-a013-492a-ae40-938b1bc6ece4" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.630903 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.728149 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq4p9\" (UniqueName: \"kubernetes.io/projected/4d11f484-2fc0-41d0-bac8-24eb2d301146-kube-api-access-zq4p9\") pod \"4d11f484-2fc0-41d0-bac8-24eb2d301146\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.728325 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-config\") pod \"4d11f484-2fc0-41d0-bac8-24eb2d301146\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.728366 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-sb\") pod \"4d11f484-2fc0-41d0-bac8-24eb2d301146\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.728501 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-nb\") pod 
\"4d11f484-2fc0-41d0-bac8-24eb2d301146\" (UID: \"4d11f484-2fc0-41d0-bac8-24eb2d301146\") " Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.734580 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d11f484-2fc0-41d0-bac8-24eb2d301146-kube-api-access-zq4p9" (OuterVolumeSpecName: "kube-api-access-zq4p9") pod "4d11f484-2fc0-41d0-bac8-24eb2d301146" (UID: "4d11f484-2fc0-41d0-bac8-24eb2d301146"). InnerVolumeSpecName "kube-api-access-zq4p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.775161 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d11f484-2fc0-41d0-bac8-24eb2d301146" (UID: "4d11f484-2fc0-41d0-bac8-24eb2d301146"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.782411 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d11f484-2fc0-41d0-bac8-24eb2d301146" (UID: "4d11f484-2fc0-41d0-bac8-24eb2d301146"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.784334 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-config" (OuterVolumeSpecName: "config") pod "4d11f484-2fc0-41d0-bac8-24eb2d301146" (UID: "4d11f484-2fc0-41d0-bac8-24eb2d301146"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.830357 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.830391 4731 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.830401 4731 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d11f484-2fc0-41d0-bac8-24eb2d301146-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.830411 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq4p9\" (UniqueName: \"kubernetes.io/projected/4d11f484-2fc0-41d0-bac8-24eb2d301146-kube-api-access-zq4p9\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:43 crc kubenswrapper[4731]: I1203 19:12:43.868188 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44538ac3-e8f5-4de9-9a95-a45baee0306d" path="/var/lib/kubelet/pods/44538ac3-e8f5-4de9-9a95-a45baee0306d/volumes" Dec 03 19:12:44 crc kubenswrapper[4731]: I1203 19:12:44.040835 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" event={"ID":"4d11f484-2fc0-41d0-bac8-24eb2d301146","Type":"ContainerDied","Data":"72acb02cc94ed3f6ef8741f13994b5847cb8a9f4d7604f6534ae62bf5f4009ad"} Dec 03 19:12:44 crc kubenswrapper[4731]: I1203 19:12:44.040968 4731 scope.go:117] "RemoveContainer" containerID="2b92f0683b098abbb47b060363eed7d671e07ab138a12ce302a1c66801c7f1ef" Dec 03 19:12:44 crc kubenswrapper[4731]: I1203 19:12:44.041032 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" Dec 03 19:12:44 crc kubenswrapper[4731]: E1203 19:12:44.046282 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-28phz" podUID="74dfcf3b-850d-48ea-9188-297579680f01" Dec 03 19:12:44 crc kubenswrapper[4731]: I1203 19:12:44.111693 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bc95b79f5-smj25"] Dec 03 19:12:44 crc kubenswrapper[4731]: I1203 19:12:44.122002 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bc95b79f5-smj25"] Dec 03 19:12:45 crc kubenswrapper[4731]: I1203 19:12:45.868097 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" path="/var/lib/kubelet/pods/4d11f484-2fc0-41d0-bac8-24eb2d301146/volumes" Dec 03 19:12:47 crc kubenswrapper[4731]: E1203 19:12:47.054836 4731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5af3fc14_8410_4706_957b_2f95a972c64e.slice/crio-8ffb57c2f3b98c1838f99eeb36c978ef0d034e0d05c43179152a8e68fd6e5972.scope\": RecentStats: unable to find data in memory cache]" Dec 03 19:12:47 crc kubenswrapper[4731]: I1203 19:12:47.084485 4731 generic.go:334] "Generic (PLEG): container finished" podID="5af3fc14-8410-4706-957b-2f95a972c64e" containerID="8ffb57c2f3b98c1838f99eeb36c978ef0d034e0d05c43179152a8e68fd6e5972" exitCode=0 Dec 03 19:12:47 crc kubenswrapper[4731]: I1203 19:12:47.084530 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kr6lg" 
event={"ID":"5af3fc14-8410-4706-957b-2f95a972c64e","Type":"ContainerDied","Data":"8ffb57c2f3b98c1838f99eeb36c978ef0d034e0d05c43179152a8e68fd6e5972"} Dec 03 19:12:47 crc kubenswrapper[4731]: I1203 19:12:47.668084 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bc95b79f5-smj25" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: i/o timeout" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.142368 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.154955 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5987d74c57-mktwp" event={"ID":"0deb4651-a013-492a-ae40-938b1bc6ece4","Type":"ContainerDied","Data":"772169fddd947c2f2fd6af0269ec6c3f8c5094b3714067b8efa1b2380ddca8ad"} Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.155025 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="772169fddd947c2f2fd6af0269ec6c3f8c5094b3714067b8efa1b2380ddca8ad" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.156705 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.159967 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.160610 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-869c6db985-g6p8s" event={"ID":"c14214be-643d-4566-a67c-0cf22e9d65f2","Type":"ContainerDied","Data":"0110396026afac3a01dd9ea0f3c80fa74a44c95241990e1779509d4626338c72"} Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.160672 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-869c6db985-g6p8s" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.162236 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b84f748f-qg75m" event={"ID":"835ae369-11e8-4c50-9757-c9a0196c977b","Type":"ContainerDied","Data":"45a17fba3f6196831d96e676c1e69bd1f1c5b7fb96b7df94c95aa5dbbfb475a0"} Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.162292 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a17fba3f6196831d96e676c1e69bd1f1c5b7fb96b7df94c95aa5dbbfb475a0" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.169129 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s4dgb" event={"ID":"7357e9a7-ce03-47ff-a1a5-55b8d1280d31","Type":"ContainerDied","Data":"b9bcc0fe85bffef16879bb3bfadd22d05b032722e4af06f13cc0b9610f318c0d"} Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.169160 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9bcc0fe85bffef16879bb3bfadd22d05b032722e4af06f13cc0b9610f318c0d" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.174935 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kr6lg" event={"ID":"5af3fc14-8410-4706-957b-2f95a972c64e","Type":"ContainerDied","Data":"f99478a31439ed33bf0fdeb3ba08b29855561adbc73d882591aa876098e4a2b5"} Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.174971 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f99478a31439ed33bf0fdeb3ba08b29855561adbc73d882591aa876098e4a2b5" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.175057 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kr6lg" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.176379 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.188088 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s4dgb" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237439 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-scripts\") pod \"c14214be-643d-4566-a67c-0cf22e9d65f2\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237524 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjbt2\" (UniqueName: \"kubernetes.io/projected/0deb4651-a013-492a-ae40-938b1bc6ece4-kube-api-access-vjbt2\") pod \"0deb4651-a013-492a-ae40-938b1bc6ece4\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237577 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-config\") pod \"5af3fc14-8410-4706-957b-2f95a972c64e\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237628 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-combined-ca-bundle\") pod \"5af3fc14-8410-4706-957b-2f95a972c64e\" (UID: \"5af3fc14-8410-4706-957b-2f95a972c64e\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237669 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znhl7\" (UniqueName: \"kubernetes.io/projected/5af3fc14-8410-4706-957b-2f95a972c64e-kube-api-access-znhl7\") pod \"5af3fc14-8410-4706-957b-2f95a972c64e\" (UID: 
\"5af3fc14-8410-4706-957b-2f95a972c64e\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237717 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-scripts\") pod \"0deb4651-a013-492a-ae40-938b1bc6ece4\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237742 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deb4651-a013-492a-ae40-938b1bc6ece4-logs\") pod \"0deb4651-a013-492a-ae40-938b1bc6ece4\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237798 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c14214be-643d-4566-a67c-0cf22e9d65f2-horizon-secret-key\") pod \"c14214be-643d-4566-a67c-0cf22e9d65f2\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.237866 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14214be-643d-4566-a67c-0cf22e9d65f2-logs\") pod \"c14214be-643d-4566-a67c-0cf22e9d65f2\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.238012 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-config-data\") pod \"0deb4651-a013-492a-ae40-938b1bc6ece4\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.238107 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-config-data\") pod \"c14214be-643d-4566-a67c-0cf22e9d65f2\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.238131 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0deb4651-a013-492a-ae40-938b1bc6ece4-horizon-secret-key\") pod \"0deb4651-a013-492a-ae40-938b1bc6ece4\" (UID: \"0deb4651-a013-492a-ae40-938b1bc6ece4\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.238151 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-274lv\" (UniqueName: \"kubernetes.io/projected/c14214be-643d-4566-a67c-0cf22e9d65f2-kube-api-access-274lv\") pod \"c14214be-643d-4566-a67c-0cf22e9d65f2\" (UID: \"c14214be-643d-4566-a67c-0cf22e9d65f2\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.239664 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-scripts" (OuterVolumeSpecName: "scripts") pod "0deb4651-a013-492a-ae40-938b1bc6ece4" (UID: "0deb4651-a013-492a-ae40-938b1bc6ece4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.240704 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c14214be-643d-4566-a67c-0cf22e9d65f2-logs" (OuterVolumeSpecName: "logs") pod "c14214be-643d-4566-a67c-0cf22e9d65f2" (UID: "c14214be-643d-4566-a67c-0cf22e9d65f2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.240777 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-scripts" (OuterVolumeSpecName: "scripts") pod "c14214be-643d-4566-a67c-0cf22e9d65f2" (UID: "c14214be-643d-4566-a67c-0cf22e9d65f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.244902 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0deb4651-a013-492a-ae40-938b1bc6ece4-logs" (OuterVolumeSpecName: "logs") pod "0deb4651-a013-492a-ae40-938b1bc6ece4" (UID: "0deb4651-a013-492a-ae40-938b1bc6ece4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.246505 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-config-data" (OuterVolumeSpecName: "config-data") pod "c14214be-643d-4566-a67c-0cf22e9d65f2" (UID: "c14214be-643d-4566-a67c-0cf22e9d65f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.246593 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-config-data" (OuterVolumeSpecName: "config-data") pod "0deb4651-a013-492a-ae40-938b1bc6ece4" (UID: "0deb4651-a013-492a-ae40-938b1bc6ece4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250034 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14214be-643d-4566-a67c-0cf22e9d65f2-kube-api-access-274lv" (OuterVolumeSpecName: "kube-api-access-274lv") pod "c14214be-643d-4566-a67c-0cf22e9d65f2" (UID: "c14214be-643d-4566-a67c-0cf22e9d65f2"). InnerVolumeSpecName "kube-api-access-274lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250075 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14214be-643d-4566-a67c-0cf22e9d65f2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c14214be-643d-4566-a67c-0cf22e9d65f2" (UID: "c14214be-643d-4566-a67c-0cf22e9d65f2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250569 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14214be-643d-4566-a67c-0cf22e9d65f2-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250627 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250655 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250684 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-274lv\" (UniqueName: \"kubernetes.io/projected/c14214be-643d-4566-a67c-0cf22e9d65f2-kube-api-access-274lv\") on node 
\"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250713 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c14214be-643d-4566-a67c-0cf22e9d65f2-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250740 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0deb4651-a013-492a-ae40-938b1bc6ece4-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250764 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deb4651-a013-492a-ae40-938b1bc6ece4-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.250787 4731 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c14214be-643d-4566-a67c-0cf22e9d65f2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.253535 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0deb4651-a013-492a-ae40-938b1bc6ece4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0deb4651-a013-492a-ae40-938b1bc6ece4" (UID: "0deb4651-a013-492a-ae40-938b1bc6ece4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.255555 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af3fc14-8410-4706-957b-2f95a972c64e-kube-api-access-znhl7" (OuterVolumeSpecName: "kube-api-access-znhl7") pod "5af3fc14-8410-4706-957b-2f95a972c64e" (UID: "5af3fc14-8410-4706-957b-2f95a972c64e"). InnerVolumeSpecName "kube-api-access-znhl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.256917 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0deb4651-a013-492a-ae40-938b1bc6ece4-kube-api-access-vjbt2" (OuterVolumeSpecName: "kube-api-access-vjbt2") pod "0deb4651-a013-492a-ae40-938b1bc6ece4" (UID: "0deb4651-a013-492a-ae40-938b1bc6ece4"). InnerVolumeSpecName "kube-api-access-vjbt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.285341 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-config" (OuterVolumeSpecName: "config") pod "5af3fc14-8410-4706-957b-2f95a972c64e" (UID: "5af3fc14-8410-4706-957b-2f95a972c64e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.304495 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5af3fc14-8410-4706-957b-2f95a972c64e" (UID: "5af3fc14-8410-4706-957b-2f95a972c64e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.351591 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-config-data\") pod \"835ae369-11e8-4c50-9757-c9a0196c977b\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.351692 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-config-data\") pod \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.351736 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/835ae369-11e8-4c50-9757-c9a0196c977b-logs\") pod \"835ae369-11e8-4c50-9757-c9a0196c977b\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.351800 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-combined-ca-bundle\") pod \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.351874 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf22b\" (UniqueName: \"kubernetes.io/projected/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-kube-api-access-gf22b\") pod \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.352068 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-scripts\") pod \"835ae369-11e8-4c50-9757-c9a0196c977b\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.352144 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-db-sync-config-data\") pod \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\" (UID: \"7357e9a7-ce03-47ff-a1a5-55b8d1280d31\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.352550 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/835ae369-11e8-4c50-9757-c9a0196c977b-logs" (OuterVolumeSpecName: "logs") pod "835ae369-11e8-4c50-9757-c9a0196c977b" (UID: "835ae369-11e8-4c50-9757-c9a0196c977b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.353036 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-config-data" (OuterVolumeSpecName: "config-data") pod "835ae369-11e8-4c50-9757-c9a0196c977b" (UID: "835ae369-11e8-4c50-9757-c9a0196c977b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.353052 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-scripts" (OuterVolumeSpecName: "scripts") pod "835ae369-11e8-4c50-9757-c9a0196c977b" (UID: "835ae369-11e8-4c50-9757-c9a0196c977b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.353133 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z4h9\" (UniqueName: \"kubernetes.io/projected/835ae369-11e8-4c50-9757-c9a0196c977b-kube-api-access-2z4h9\") pod \"835ae369-11e8-4c50-9757-c9a0196c977b\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.353770 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/835ae369-11e8-4c50-9757-c9a0196c977b-horizon-secret-key\") pod \"835ae369-11e8-4c50-9757-c9a0196c977b\" (UID: \"835ae369-11e8-4c50-9757-c9a0196c977b\") " Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.354380 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.354398 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af3fc14-8410-4706-957b-2f95a972c64e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.354413 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znhl7\" (UniqueName: \"kubernetes.io/projected/5af3fc14-8410-4706-957b-2f95a972c64e-kube-api-access-znhl7\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.354426 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.354439 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/835ae369-11e8-4c50-9757-c9a0196c977b-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.354452 4731 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0deb4651-a013-492a-ae40-938b1bc6ece4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.354468 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/835ae369-11e8-4c50-9757-c9a0196c977b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.354553 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjbt2\" (UniqueName: \"kubernetes.io/projected/0deb4651-a013-492a-ae40-938b1bc6ece4-kube-api-access-vjbt2\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.357455 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835ae369-11e8-4c50-9757-c9a0196c977b-kube-api-access-2z4h9" (OuterVolumeSpecName: "kube-api-access-2z4h9") pod "835ae369-11e8-4c50-9757-c9a0196c977b" (UID: "835ae369-11e8-4c50-9757-c9a0196c977b"). InnerVolumeSpecName "kube-api-access-2z4h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.357569 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7357e9a7-ce03-47ff-a1a5-55b8d1280d31" (UID: "7357e9a7-ce03-47ff-a1a5-55b8d1280d31"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.358352 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-kube-api-access-gf22b" (OuterVolumeSpecName: "kube-api-access-gf22b") pod "7357e9a7-ce03-47ff-a1a5-55b8d1280d31" (UID: "7357e9a7-ce03-47ff-a1a5-55b8d1280d31"). InnerVolumeSpecName "kube-api-access-gf22b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.360366 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/835ae369-11e8-4c50-9757-c9a0196c977b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "835ae369-11e8-4c50-9757-c9a0196c977b" (UID: "835ae369-11e8-4c50-9757-c9a0196c977b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.377289 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7357e9a7-ce03-47ff-a1a5-55b8d1280d31" (UID: "7357e9a7-ce03-47ff-a1a5-55b8d1280d31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.414482 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-config-data" (OuterVolumeSpecName: "config-data") pod "7357e9a7-ce03-47ff-a1a5-55b8d1280d31" (UID: "7357e9a7-ce03-47ff-a1a5-55b8d1280d31"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.455847 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.455882 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf22b\" (UniqueName: \"kubernetes.io/projected/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-kube-api-access-gf22b\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.455894 4731 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.455903 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z4h9\" (UniqueName: \"kubernetes.io/projected/835ae369-11e8-4c50-9757-c9a0196c977b-kube-api-access-2z4h9\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.455913 4731 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/835ae369-11e8-4c50-9757-c9a0196c977b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.455924 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7357e9a7-ce03-47ff-a1a5-55b8d1280d31-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.556619 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-869c6db985-g6p8s"] Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.564371 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/horizon-869c6db985-g6p8s"] Dec 03 19:12:53 crc kubenswrapper[4731]: I1203 19:12:53.875332 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14214be-643d-4566-a67c-0cf22e9d65f2" path="/var/lib/kubelet/pods/c14214be-643d-4566-a67c-0cf22e9d65f2/volumes" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.182730 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5987d74c57-mktwp" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.182810 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s4dgb" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.183397 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b84f748f-qg75m" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.268540 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b84f748f-qg75m"] Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.298353 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b84f748f-qg75m"] Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.308779 4731 scope.go:117] "RemoveContainer" containerID="0cd55b53525e0f5d95d18acabfaf6ca0895b829ad75fd24de303502ce7b5f380" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.334225 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5987d74c57-mktwp"] Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.345243 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5987d74c57-mktwp"] Dec 03 19:12:54 crc kubenswrapper[4731]: E1203 19:12:54.403861 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 03 19:12:54 crc kubenswrapper[4731]: E1203 19:12:54.404036 4731 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-schb6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityCon
text{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-bhw9w_openstack(50dcbd31-ca7a-47bc-831f-e5f5e2da78ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:12:54 crc kubenswrapper[4731]: E1203 19:12:54.405635 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-bhw9w" podUID="50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.472934 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84487d7f96-f2jmd"] Dec 03 19:12:54 crc kubenswrapper[4731]: E1203 19:12:54.473705 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerName="init" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.473724 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerName="init" Dec 03 19:12:54 crc kubenswrapper[4731]: E1203 19:12:54.473738 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af3fc14-8410-4706-957b-2f95a972c64e" containerName="neutron-db-sync" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.473745 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af3fc14-8410-4706-957b-2f95a972c64e" containerName="neutron-db-sync" Dec 03 19:12:54 crc kubenswrapper[4731]: E1203 19:12:54.473754 4731 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7357e9a7-ce03-47ff-a1a5-55b8d1280d31" containerName="glance-db-sync" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.473761 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7357e9a7-ce03-47ff-a1a5-55b8d1280d31" containerName="glance-db-sync" Dec 03 19:12:54 crc kubenswrapper[4731]: E1203 19:12:54.473783 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerName="dnsmasq-dns" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.473789 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerName="dnsmasq-dns" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.473945 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7357e9a7-ce03-47ff-a1a5-55b8d1280d31" containerName="glance-db-sync" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.473961 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d11f484-2fc0-41d0-bac8-24eb2d301146" containerName="dnsmasq-dns" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.473969 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af3fc14-8410-4706-957b-2f95a972c64e" containerName="neutron-db-sync" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.475073 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.479909 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.480224 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-fjtjs" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.480401 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.480630 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.518376 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84487d7f96-f2jmd"] Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.586932 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-httpd-config\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.587022 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsqhv\" (UniqueName: \"kubernetes.io/projected/f7f0840d-a586-488a-badd-4439c97a8f1d-kube-api-access-gsqhv\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.587114 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-ovndb-tls-certs\") pod 
\"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.587161 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-config\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.588105 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-combined-ca-bundle\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.689860 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-httpd-config\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.689954 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsqhv\" (UniqueName: \"kubernetes.io/projected/f7f0840d-a586-488a-badd-4439c97a8f1d-kube-api-access-gsqhv\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.690045 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-ovndb-tls-certs\") pod \"neutron-84487d7f96-f2jmd\" (UID: 
\"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.690095 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-config\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.690143 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-combined-ca-bundle\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.700676 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-combined-ca-bundle\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.701903 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-config\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.703346 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-ovndb-tls-certs\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.705815 
4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-httpd-config\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.711093 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsqhv\" (UniqueName: \"kubernetes.io/projected/f7f0840d-a586-488a-badd-4439c97a8f1d-kube-api-access-gsqhv\") pod \"neutron-84487d7f96-f2jmd\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.827768 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.951812 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7578458cb-br8st"] Dec 03 19:12:54 crc kubenswrapper[4731]: W1203 19:12:54.963221 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a524772_1481_4781_9847_f3394664a2d3.slice/crio-3a1846766df8c79d48caf1d075f619042fa6a109d5cd0a15f06ac1935d9fee17 WatchSource:0}: Error finding container 3a1846766df8c79d48caf1d075f619042fa6a109d5cd0a15f06ac1935d9fee17: Status 404 returned error can't find the container with id 3a1846766df8c79d48caf1d075f619042fa6a109d5cd0a15f06ac1935d9fee17 Dec 03 19:12:54 crc kubenswrapper[4731]: I1203 19:12:54.968349 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-687f68f6b4-jvzgv"] Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.100089 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qbnk6"] Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.225493 4731 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/horizon-7578458cb-br8st" event={"ID":"5a524772-1481-4781-9847-f3394664a2d3","Type":"ContainerStarted","Data":"3a1846766df8c79d48caf1d075f619042fa6a109d5cd0a15f06ac1935d9fee17"} Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.262205 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qbnk6" event={"ID":"eb136717-d762-4d86-8b8b-e3a59cadb469","Type":"ContainerStarted","Data":"75c6c6867bffa7799356daa7e91de7015287bb4cd634b6ba72dde34030b602ed"} Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.265357 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerStarted","Data":"82c25f4498741178874aec37b41cdd665e3584ee8f0d119e87a79e877938ac42"} Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.266599 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687f68f6b4-jvzgv" event={"ID":"f69c7907-c04f-4b84-9e31-59fca146a62d","Type":"ContainerStarted","Data":"434eff705dcb8799587de37641c4bff976c05a97f48dd016c9b71498964c1dd3"} Dec 03 19:12:55 crc kubenswrapper[4731]: E1203 19:12:55.311717 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-bhw9w" podUID="50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.663501 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.666175 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.675196 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.675778 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.675983 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-txrr7" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.707031 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.800707 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.800764 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.800811 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 
19:12:55.800851 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-logs\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.800901 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-scripts\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.800926 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-config-data\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.800947 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwqrt\" (UniqueName: \"kubernetes.io/projected/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-kube-api-access-nwqrt\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0" Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.843151 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84487d7f96-f2jmd"] Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.874755 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0deb4651-a013-492a-ae40-938b1bc6ece4" path="/var/lib/kubelet/pods/0deb4651-a013-492a-ae40-938b1bc6ece4/volumes" Dec 03 19:12:55 crc 
kubenswrapper[4731]: I1203 19:12:55.875443 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835ae369-11e8-4c50-9757-c9a0196c977b" path="/var/lib/kubelet/pods/835ae369-11e8-4c50-9757-c9a0196c977b/volumes"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.899505 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.901578 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.905692 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.913523 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.913659 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.913782 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.913899 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-logs\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.914073 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-scripts\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.914143 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-config-data\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.914199 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwqrt\" (UniqueName: \"kubernetes.io/projected/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-kube-api-access-nwqrt\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.915172 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-logs\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.915442 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.918029 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.918564 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-scripts\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.932643 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-config-data\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.935616 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.939785 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 19:12:55 crc kubenswrapper[4731]: I1203 19:12:55.943535 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwqrt\" (UniqueName: \"kubernetes.io/projected/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-kube-api-access-nwqrt\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.021639 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.022067 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.022422 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.022539 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.022821 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.022967 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.023073 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbczb\" (UniqueName: \"kubernetes.io/projected/8a6190db-33ea-459c-a450-c7d9fd3493c8-kube-api-access-bbczb\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.061695 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") " pod="openstack/glance-default-external-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.124712 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.124781 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.124835 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.124892 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.124920 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.124964 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbczb\" (UniqueName: \"kubernetes.io/projected/8a6190db-33ea-459c-a450-c7d9fd3493c8-kube-api-access-bbczb\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.124993 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.125487 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.129407 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.129516 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.132220 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.139003 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.144893 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.148528 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbczb\" (UniqueName: \"kubernetes.io/projected/8a6190db-33ea-459c-a450-c7d9fd3493c8-kube-api-access-bbczb\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.166280 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") " pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.244891 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.281297 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687f68f6b4-jvzgv" event={"ID":"f69c7907-c04f-4b84-9e31-59fca146a62d","Type":"ContainerStarted","Data":"cbf678eabf96eed394c8395472bdf57b49dfe6cd318626870fa3106aed7d89b5"}
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.283099 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7578458cb-br8st" event={"ID":"5a524772-1481-4781-9847-f3394664a2d3","Type":"ContainerStarted","Data":"741769476b3c35dd318d89d58745037f29fd350469d9dc5cd48e9d415af0a897"}
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.284498 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qbnk6" event={"ID":"eb136717-d762-4d86-8b8b-e3a59cadb469","Type":"ContainerStarted","Data":"88104e63aace8073210f9d0e6d5e70acec74c4ff0f6f2fbdb25d75f8c0503dba"}
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.287054 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84487d7f96-f2jmd" event={"ID":"f7f0840d-a586-488a-badd-4439c97a8f1d","Type":"ContainerStarted","Data":"eaeb841a4e38e245deb30c52e6f7317e04c0043137c1be520774a226d25dbde6"}
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.307593 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qbnk6" podStartSLOduration=14.307566837 podStartE2EDuration="14.307566837s" podCreationTimestamp="2025-12-03 19:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:56.30638642 +0000 UTC m=+1096.904980894" watchObservedRunningTime="2025-12-03 19:12:56.307566837 +0000 UTC m=+1096.906161301"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.310997 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.468346 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 19:12:56 crc kubenswrapper[4731]: I1203 19:12:56.468873 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.254206 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.333842 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84487d7f96-f2jmd" event={"ID":"f7f0840d-a586-488a-badd-4439c97a8f1d","Type":"ContainerStarted","Data":"26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44"}
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.334165 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84487d7f96-f2jmd" event={"ID":"f7f0840d-a586-488a-badd-4439c97a8f1d","Type":"ContainerStarted","Data":"1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319"}
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.334218 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84487d7f96-f2jmd"
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.408066 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa","Type":"ContainerStarted","Data":"63adc4f83dfd0bb907c09157ac5dff2459d352e37b79e5232f645c356cd8e229"}
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.435891 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84487d7f96-f2jmd" podStartSLOduration=3.435854609 podStartE2EDuration="3.435854609s" podCreationTimestamp="2025-12-03 19:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:57.406027648 +0000 UTC m=+1098.004622122" watchObservedRunningTime="2025-12-03 19:12:57.435854609 +0000 UTC m=+1098.034449073"
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.480926 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerStarted","Data":"c799c4598778d6cb537fbff91743aa1254ea976f68d020b92903ed42ca10d4d1"}
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.489958 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687f68f6b4-jvzgv" event={"ID":"f69c7907-c04f-4b84-9e31-59fca146a62d","Type":"ContainerStarted","Data":"b824719360d5c598d6d63177940c2fff30207b4b666481a901664d9b35d6dd38"}
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.505080 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7578458cb-br8st" event={"ID":"5a524772-1481-4781-9847-f3394664a2d3","Type":"ContainerStarted","Data":"b94a7fc489c40b9b7a2d5f3293247d67217a05076e9c25cc1bdb23dddeda5008"}
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.536046 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-687f68f6b4-jvzgv" podStartSLOduration=26.728482875 podStartE2EDuration="27.536018172s" podCreationTimestamp="2025-12-03 19:12:30 +0000 UTC" firstStartedPulling="2025-12-03 19:12:54.981165047 +0000 UTC m=+1095.579759511" lastFinishedPulling="2025-12-03 19:12:55.788700344 +0000 UTC m=+1096.387294808" observedRunningTime="2025-12-03 19:12:57.525118326 +0000 UTC m=+1098.123712800" watchObservedRunningTime="2025-12-03 19:12:57.536018172 +0000 UTC m=+1098.134612626"
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.572339 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7578458cb-br8st" podStartSLOduration=26.765645132 podStartE2EDuration="27.572312813s" podCreationTimestamp="2025-12-03 19:12:30 +0000 UTC" firstStartedPulling="2025-12-03 19:12:54.981159017 +0000 UTC m=+1095.579753481" lastFinishedPulling="2025-12-03 19:12:55.787826698 +0000 UTC m=+1096.386421162" observedRunningTime="2025-12-03 19:12:57.566661828 +0000 UTC m=+1098.165256312" watchObservedRunningTime="2025-12-03 19:12:57.572312813 +0000 UTC m=+1098.170907277"
Dec 03 19:12:57 crc kubenswrapper[4731]: I1203 19:12:57.626524 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 19:12:58 crc kubenswrapper[4731]: I1203 19:12:58.534001 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tx58l" event={"ID":"936695f7-54f4-4fa7-8373-75c84337ea1f","Type":"ContainerStarted","Data":"06a27cf623ef61e09a573f659ea97ad6f23d0841eb0e339d059db48eff9ed51c"}
Dec 03 19:12:58 crc kubenswrapper[4731]: I1203 19:12:58.580312 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tx58l" podStartSLOduration=3.505421413 podStartE2EDuration="36.580280169s" podCreationTimestamp="2025-12-03 19:12:22 +0000 UTC" firstStartedPulling="2025-12-03 19:12:24.403562988 +0000 UTC m=+1065.002157462" lastFinishedPulling="2025-12-03 19:12:57.478421764 +0000 UTC m=+1098.077016218" observedRunningTime="2025-12-03 19:12:58.554947737 +0000 UTC m=+1099.153542201" watchObservedRunningTime="2025-12-03 19:12:58.580280169 +0000 UTC m=+1099.178874633"
Dec 03 19:12:58 crc kubenswrapper[4731]: I1203 19:12:58.585661 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa","Type":"ContainerStarted","Data":"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce"}
Dec 03 19:12:58 crc kubenswrapper[4731]: I1203 19:12:58.588328 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a6190db-33ea-459c-a450-c7d9fd3493c8","Type":"ContainerStarted","Data":"f359ab010842eccb587fb42e8305c31a18189bf2b6e3b6a63f25b2675634b416"}
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.034791 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.146927 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.604027 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a6190db-33ea-459c-a450-c7d9fd3493c8","Type":"ContainerStarted","Data":"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214"}
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.604492 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a6190db-33ea-459c-a450-c7d9fd3493c8","Type":"ContainerStarted","Data":"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986"}
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.604560 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerName="glance-log" containerID="cri-o://ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986" gracePeriod=30
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.604714 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerName="glance-httpd" containerID="cri-o://d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214" gracePeriod=30
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.620997 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerName="glance-log" containerID="cri-o://b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce" gracePeriod=30
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.621166 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerName="glance-httpd" containerID="cri-o://cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513" gracePeriod=30
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.621423 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa","Type":"ContainerStarted","Data":"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513"}
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.629301 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.629281613 podStartE2EDuration="5.629281613s" podCreationTimestamp="2025-12-03 19:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:59.624933559 +0000 UTC m=+1100.223528023" watchObservedRunningTime="2025-12-03 19:12:59.629281613 +0000 UTC m=+1100.227876077"
Dec 03 19:12:59 crc kubenswrapper[4731]: I1203 19:12:59.662597 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.662567811 podStartE2EDuration="5.662567811s" podCreationTimestamp="2025-12-03 19:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:12:59.660105634 +0000 UTC m=+1100.258700098" watchObservedRunningTime="2025-12-03 19:12:59.662567811 +0000 UTC m=+1100.261162275"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.486064 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.612195 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbczb\" (UniqueName: \"kubernetes.io/projected/8a6190db-33ea-459c-a450-c7d9fd3493c8-kube-api-access-bbczb\") pod \"8a6190db-33ea-459c-a450-c7d9fd3493c8\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.613200 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-httpd-run\") pod \"8a6190db-33ea-459c-a450-c7d9fd3493c8\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.613360 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-logs\") pod \"8a6190db-33ea-459c-a450-c7d9fd3493c8\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.613433 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-scripts\") pod \"8a6190db-33ea-459c-a450-c7d9fd3493c8\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.613559 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-combined-ca-bundle\") pod \"8a6190db-33ea-459c-a450-c7d9fd3493c8\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.613676 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8a6190db-33ea-459c-a450-c7d9fd3493c8\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.613785 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a6190db-33ea-459c-a450-c7d9fd3493c8" (UID: "8a6190db-33ea-459c-a450-c7d9fd3493c8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.613921 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-config-data\") pod \"8a6190db-33ea-459c-a450-c7d9fd3493c8\" (UID: \"8a6190db-33ea-459c-a450-c7d9fd3493c8\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.614437 4731 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.623824 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6190db-33ea-459c-a450-c7d9fd3493c8-kube-api-access-bbczb" (OuterVolumeSpecName: "kube-api-access-bbczb") pod "8a6190db-33ea-459c-a450-c7d9fd3493c8" (UID: "8a6190db-33ea-459c-a450-c7d9fd3493c8"). InnerVolumeSpecName "kube-api-access-bbczb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.624971 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-logs" (OuterVolumeSpecName: "logs") pod "8a6190db-33ea-459c-a450-c7d9fd3493c8" (UID: "8a6190db-33ea-459c-a450-c7d9fd3493c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.641273 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "8a6190db-33ea-459c-a450-c7d9fd3493c8" (UID: "8a6190db-33ea-459c-a450-c7d9fd3493c8"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.649769 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.673601 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a6190db-33ea-459c-a450-c7d9fd3493c8" (UID: "8a6190db-33ea-459c-a450-c7d9fd3493c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.683481 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-scripts" (OuterVolumeSpecName: "scripts") pod "8a6190db-33ea-459c-a450-c7d9fd3493c8" (UID: "8a6190db-33ea-459c-a450-c7d9fd3493c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.688329 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-748fc65857-2r69r"]
Dec 03 19:13:00 crc kubenswrapper[4731]: E1203 19:13:00.689020 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerName="glance-log"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.689150 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerName="glance-log"
Dec 03 19:13:00 crc kubenswrapper[4731]: E1203 19:13:00.689248 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerName="glance-httpd"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.689343 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerName="glance-httpd"
Dec 03 19:13:00 crc kubenswrapper[4731]: E1203 19:13:00.689422 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerName="glance-log"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.689508 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerName="glance-log"
Dec 03 19:13:00 crc kubenswrapper[4731]: E1203 19:13:00.689601 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerName="glance-httpd"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.689688 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerName="glance-httpd"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.689989 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerName="glance-httpd"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.690078 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerName="glance-httpd"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.690149 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerName="glance-log"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.690220 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerName="glance-log"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.691484 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-748fc65857-2r69r"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.701757 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.704224 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.713347 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-28phz" event={"ID":"74dfcf3b-850d-48ea-9188-297579680f01","Type":"ContainerStarted","Data":"c5a7e7747a6e165d39c33b622b7b07df305d27612d046d6e8b4a19635eb321bc"}
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.717538 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-combined-ca-bundle\") pod \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.717614 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.717656 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-httpd-run\") pod \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.717707 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-config-data\") pod \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.717783 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-logs\") pod \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.717885 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwqrt\" (UniqueName: \"kubernetes.io/projected/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-kube-api-access-nwqrt\") pod \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.718004 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-scripts\") pod \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\" (UID: \"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa\") "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.718460 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" (UID: "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.718582 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-logs" (OuterVolumeSpecName: "logs") pod "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" (UID: "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.719037 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.719069 4731 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.719081 4731 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.719093 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-logs\") on node \"crc\" DevicePath \"\""
Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.719105 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbczb\" (UniqueName: \"kubernetes.io/projected/8a6190db-33ea-459c-a450-c7d9fd3493c8-kube-api-access-bbczb\") on node \"crc\"
DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.719117 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.719126 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a6190db-33ea-459c-a450-c7d9fd3493c8-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.736082 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-scripts" (OuterVolumeSpecName: "scripts") pod "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" (UID: "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.739698 4731 generic.go:334] "Generic (PLEG): container finished" podID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerID="d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214" exitCode=143 Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.739946 4731 generic.go:334] "Generic (PLEG): container finished" podID="8a6190db-33ea-459c-a450-c7d9fd3493c8" containerID="ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986" exitCode=143 Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.740386 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a6190db-33ea-459c-a450-c7d9fd3493c8","Type":"ContainerDied","Data":"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214"} Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.740520 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"8a6190db-33ea-459c-a450-c7d9fd3493c8","Type":"ContainerDied","Data":"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986"} Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.740593 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a6190db-33ea-459c-a450-c7d9fd3493c8","Type":"ContainerDied","Data":"f359ab010842eccb587fb42e8305c31a18189bf2b6e3b6a63f25b2675634b416"} Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.740712 4731 scope.go:117] "RemoveContainer" containerID="d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.741023 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.742943 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" (UID: "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.744347 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-config-data" (OuterVolumeSpecName: "config-data") pod "8a6190db-33ea-459c-a450-c7d9fd3493c8" (UID: "8a6190db-33ea-459c-a450-c7d9fd3493c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.750081 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748fc65857-2r69r"] Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.751753 4731 generic.go:334] "Generic (PLEG): container finished" podID="eb136717-d762-4d86-8b8b-e3a59cadb469" containerID="88104e63aace8073210f9d0e6d5e70acec74c4ff0f6f2fbdb25d75f8c0503dba" exitCode=0 Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.752553 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qbnk6" event={"ID":"eb136717-d762-4d86-8b8b-e3a59cadb469","Type":"ContainerDied","Data":"88104e63aace8073210f9d0e6d5e70acec74c4ff0f6f2fbdb25d75f8c0503dba"} Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.752843 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-kube-api-access-nwqrt" (OuterVolumeSpecName: "kube-api-access-nwqrt") pod "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" (UID: "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa"). InnerVolumeSpecName "kube-api-access-nwqrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.762319 4731 generic.go:334] "Generic (PLEG): container finished" podID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerID="cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513" exitCode=143 Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.762829 4731 generic.go:334] "Generic (PLEG): container finished" podID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" containerID="b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce" exitCode=143 Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.762948 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa","Type":"ContainerDied","Data":"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513"} Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.763441 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa","Type":"ContainerDied","Data":"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce"} Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.763537 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dff35b8-0657-4112-b5d9-3d09b6c8ffaa","Type":"ContainerDied","Data":"63adc4f83dfd0bb907c09157ac5dff2459d352e37b79e5232f645c356cd8e229"} Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.763695 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.775335 4731 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.781817 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-28phz" podStartSLOduration=3.560602097 podStartE2EDuration="38.781790523s" podCreationTimestamp="2025-12-03 19:12:22 +0000 UTC" firstStartedPulling="2025-12-03 19:12:24.413455883 +0000 UTC m=+1065.012050347" lastFinishedPulling="2025-12-03 19:12:59.634644309 +0000 UTC m=+1100.233238773" observedRunningTime="2025-12-03 19:13:00.756081229 +0000 UTC m=+1101.354675693" watchObservedRunningTime="2025-12-03 19:13:00.781790523 +0000 UTC m=+1101.380384997" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.828726 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-config-data" (OuterVolumeSpecName: "config-data") pod "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" (UID: "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.829738 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-httpd-config\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.829867 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-internal-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.829965 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-public-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830058 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-combined-ca-bundle\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830163 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlg7\" (UniqueName: \"kubernetes.io/projected/74e9f605-1edd-4f8e-af56-133041b4c068-kube-api-access-7tlg7\") pod \"neutron-748fc65857-2r69r\" (UID: 
\"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830202 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-ovndb-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830541 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-config\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830636 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830652 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6190db-33ea-459c-a450-c7d9fd3493c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830664 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwqrt\" (UniqueName: \"kubernetes.io/projected/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-kube-api-access-nwqrt\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830677 4731 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830687 4731 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.830709 4731 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.839312 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" (UID: "0dff35b8-0657-4112-b5d9-3d09b6c8ffaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.849340 4731 scope.go:117] "RemoveContainer" containerID="ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.864345 4731 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.900843 4731 scope.go:117] "RemoveContainer" containerID="d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214" Dec 03 19:13:00 crc kubenswrapper[4731]: E1203 19:13:00.901557 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214\": container with ID starting with d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214 not found: ID does not exist" containerID="d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.901621 
4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214"} err="failed to get container status \"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214\": rpc error: code = NotFound desc = could not find container \"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214\": container with ID starting with d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214 not found: ID does not exist" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.901644 4731 scope.go:117] "RemoveContainer" containerID="ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986" Dec 03 19:13:00 crc kubenswrapper[4731]: E1203 19:13:00.901989 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986\": container with ID starting with ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986 not found: ID does not exist" containerID="ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.902033 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986"} err="failed to get container status \"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986\": rpc error: code = NotFound desc = could not find container \"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986\": container with ID starting with ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986 not found: ID does not exist" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.902052 4731 scope.go:117] "RemoveContainer" containerID="d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 
19:13:00.903049 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214"} err="failed to get container status \"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214\": rpc error: code = NotFound desc = could not find container \"d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214\": container with ID starting with d88c226db305bc7b3ddfeb8b122207516a2d66223bf358d09b63e5f0a9011214 not found: ID does not exist" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.903087 4731 scope.go:117] "RemoveContainer" containerID="ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.905607 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986"} err="failed to get container status \"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986\": rpc error: code = NotFound desc = could not find container \"ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986\": container with ID starting with ecdab9ec295a9427ac1e6c3398e59c7e64d90005384f32715aef84b153c22986 not found: ID does not exist" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.905657 4731 scope.go:117] "RemoveContainer" containerID="cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.932977 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlg7\" (UniqueName: \"kubernetes.io/projected/74e9f605-1edd-4f8e-af56-133041b4c068-kube-api-access-7tlg7\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.933507 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-ovndb-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.934446 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-config\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.934568 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-internal-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.934609 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-httpd-config\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.936309 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-public-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.936472 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-combined-ca-bundle\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.936700 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.936711 4731 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.939232 4731 scope.go:117] "RemoveContainer" containerID="b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.940069 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-ovndb-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.941058 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-public-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.942649 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-config\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " 
pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.944090 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-combined-ca-bundle\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.944203 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-httpd-config\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.945149 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e9f605-1edd-4f8e-af56-133041b4c068-internal-tls-certs\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.958610 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlg7\" (UniqueName: \"kubernetes.io/projected/74e9f605-1edd-4f8e-af56-133041b4c068-kube-api-access-7tlg7\") pod \"neutron-748fc65857-2r69r\" (UID: \"74e9f605-1edd-4f8e-af56-133041b4c068\") " pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.976288 4731 scope.go:117] "RemoveContainer" containerID="cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513" Dec 03 19:13:00 crc kubenswrapper[4731]: E1203 19:13:00.977867 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513\": container 
with ID starting with cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513 not found: ID does not exist" containerID="cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.977903 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513"} err="failed to get container status \"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513\": rpc error: code = NotFound desc = could not find container \"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513\": container with ID starting with cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513 not found: ID does not exist" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.977927 4731 scope.go:117] "RemoveContainer" containerID="b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce" Dec 03 19:13:00 crc kubenswrapper[4731]: E1203 19:13:00.978667 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce\": container with ID starting with b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce not found: ID does not exist" containerID="b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.978688 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce"} err="failed to get container status \"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce\": rpc error: code = NotFound desc = could not find container \"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce\": container with ID starting with b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce not 
found: ID does not exist" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.978701 4731 scope.go:117] "RemoveContainer" containerID="cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.979313 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513"} err="failed to get container status \"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513\": rpc error: code = NotFound desc = could not find container \"cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513\": container with ID starting with cb6dcc0e471920e134e3387121da031fcef04324aeca531410cd038fae9dd513 not found: ID does not exist" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.979334 4731 scope.go:117] "RemoveContainer" containerID="b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce" Dec 03 19:13:00 crc kubenswrapper[4731]: I1203 19:13:00.979581 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce"} err="failed to get container status \"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce\": rpc error: code = NotFound desc = could not find container \"b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce\": container with ID starting with b8411fbea3943b36392358bde028605c4797cbff527cde2856cbf74d51a72dce not found: ID does not exist" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.054492 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.078323 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.078679 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.169408 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7578458cb-br8st" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.170349 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7578458cb-br8st" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.228024 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.236615 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.285186 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.286999 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.293730 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.295753 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.295874 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-txrr7" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.296478 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.298783 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.311732 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.328325 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.364340 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.366127 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.367407 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.367486 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.371436 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5d5f\" (UniqueName: \"kubernetes.io/projected/12509218-711b-46a4-a560-493ce03af965-kube-api-access-q5d5f\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.371623 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.371772 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-logs\") pod 
\"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.371850 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.371899 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-config-data\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.372072 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-scripts\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.382112 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.384231 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.384450 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.530025 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.530842 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.530985 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.531078 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.531191 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5d5f\" (UniqueName: \"kubernetes.io/projected/12509218-711b-46a4-a560-493ce03af965-kube-api-access-q5d5f\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.531389 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.531530 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.531623 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.531718 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-logs\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.531829 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.531916 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-config-data\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.532084 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.532178 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nskb2\" (UniqueName: \"kubernetes.io/projected/13c7ff16-cffb-40a4-909c-6c4bca6598a3-kube-api-access-nskb2\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.533672 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.533923 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-logs\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.534601 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") 
pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.548479 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-config-data\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.557928 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.572611 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-scripts\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.583228 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5d5f\" (UniqueName: \"kubernetes.io/projected/12509218-711b-46a4-a560-493ce03af965-kube-api-access-q5d5f\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.532304 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.590570 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.590701 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.602279 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.609848 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.702740 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.702791 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.702847 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.702868 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nskb2\" (UniqueName: \"kubernetes.io/projected/13c7ff16-cffb-40a4-909c-6c4bca6598a3-kube-api-access-nskb2\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.702890 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.702920 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc 
kubenswrapper[4731]: I1203 19:13:01.702938 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.702970 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.703420 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.714016 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.714413 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.714644 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.713921 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.739319 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.740709 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nskb2\" (UniqueName: \"kubernetes.io/projected/13c7ff16-cffb-40a4-909c-6c4bca6598a3-kube-api-access-nskb2\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.741231 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.837418 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.879726 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dff35b8-0657-4112-b5d9-3d09b6c8ffaa" path="/var/lib/kubelet/pods/0dff35b8-0657-4112-b5d9-3d09b6c8ffaa/volumes" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.881465 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6190db-33ea-459c-a450-c7d9fd3493c8" path="/var/lib/kubelet/pods/8a6190db-33ea-459c-a450-c7d9fd3493c8/volumes" Dec 03 19:13:01 crc kubenswrapper[4731]: I1203 19:13:01.914873 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.024519 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.102955 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748fc65857-2r69r"] Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.370028 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.522438 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-credential-keys\") pod \"eb136717-d762-4d86-8b8b-e3a59cadb469\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.523030 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-config-data\") pod \"eb136717-d762-4d86-8b8b-e3a59cadb469\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.523066 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-scripts\") pod \"eb136717-d762-4d86-8b8b-e3a59cadb469\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.524224 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-combined-ca-bundle\") pod \"eb136717-d762-4d86-8b8b-e3a59cadb469\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.524492 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-fernet-keys\") pod \"eb136717-d762-4d86-8b8b-e3a59cadb469\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.524653 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnt6p\" (UniqueName: 
\"kubernetes.io/projected/eb136717-d762-4d86-8b8b-e3a59cadb469-kube-api-access-mnt6p\") pod \"eb136717-d762-4d86-8b8b-e3a59cadb469\" (UID: \"eb136717-d762-4d86-8b8b-e3a59cadb469\") " Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.531800 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "eb136717-d762-4d86-8b8b-e3a59cadb469" (UID: "eb136717-d762-4d86-8b8b-e3a59cadb469"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.541580 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-scripts" (OuterVolumeSpecName: "scripts") pod "eb136717-d762-4d86-8b8b-e3a59cadb469" (UID: "eb136717-d762-4d86-8b8b-e3a59cadb469"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.541943 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "eb136717-d762-4d86-8b8b-e3a59cadb469" (UID: "eb136717-d762-4d86-8b8b-e3a59cadb469"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.544785 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb136717-d762-4d86-8b8b-e3a59cadb469-kube-api-access-mnt6p" (OuterVolumeSpecName: "kube-api-access-mnt6p") pod "eb136717-d762-4d86-8b8b-e3a59cadb469" (UID: "eb136717-d762-4d86-8b8b-e3a59cadb469"). InnerVolumeSpecName "kube-api-access-mnt6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.568596 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-config-data" (OuterVolumeSpecName: "config-data") pod "eb136717-d762-4d86-8b8b-e3a59cadb469" (UID: "eb136717-d762-4d86-8b8b-e3a59cadb469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.612476 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb136717-d762-4d86-8b8b-e3a59cadb469" (UID: "eb136717-d762-4d86-8b8b-e3a59cadb469"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.627004 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.627042 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.627055 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.627072 4731 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:02 crc 
kubenswrapper[4731]: I1203 19:13:02.627082 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnt6p\" (UniqueName: \"kubernetes.io/projected/eb136717-d762-4d86-8b8b-e3a59cadb469-kube-api-access-mnt6p\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.627091 4731 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb136717-d762-4d86-8b8b-e3a59cadb469-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.817908 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.872831 4731 generic.go:334] "Generic (PLEG): container finished" podID="936695f7-54f4-4fa7-8373-75c84337ea1f" containerID="06a27cf623ef61e09a573f659ea97ad6f23d0841eb0e339d059db48eff9ed51c" exitCode=0 Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.872900 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tx58l" event={"ID":"936695f7-54f4-4fa7-8373-75c84337ea1f","Type":"ContainerDied","Data":"06a27cf623ef61e09a573f659ea97ad6f23d0841eb0e339d059db48eff9ed51c"} Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.875086 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qbnk6" event={"ID":"eb136717-d762-4d86-8b8b-e3a59cadb469","Type":"ContainerDied","Data":"75c6c6867bffa7799356daa7e91de7015287bb4cd634b6ba72dde34030b602ed"} Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.875114 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75c6c6867bffa7799356daa7e91de7015287bb4cd634b6ba72dde34030b602ed" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.875156 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qbnk6" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.876674 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748fc65857-2r69r" event={"ID":"74e9f605-1edd-4f8e-af56-133041b4c068","Type":"ContainerStarted","Data":"c7c9eb4f36151367b70a4153e57f56fb61f765ee24d92d7999cafb2da9943a7b"} Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.876698 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748fc65857-2r69r" event={"ID":"74e9f605-1edd-4f8e-af56-133041b4c068","Type":"ContainerStarted","Data":"5252504233e05d5cfab8ed668cc3fa034b6d627cfafeb1d61caf39b73e6df911"} Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.956265 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-787778d6bb-5glsd"] Dec 03 19:13:02 crc kubenswrapper[4731]: E1203 19:13:02.957227 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb136717-d762-4d86-8b8b-e3a59cadb469" containerName="keystone-bootstrap" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.957422 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb136717-d762-4d86-8b8b-e3a59cadb469" containerName="keystone-bootstrap" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.957850 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb136717-d762-4d86-8b8b-e3a59cadb469" containerName="keystone-bootstrap" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.958877 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.965818 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.966198 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.966390 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.966560 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rlxdn" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.966721 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.966899 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 19:13:02 crc kubenswrapper[4731]: I1203 19:13:02.975020 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-787778d6bb-5glsd"] Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.067632 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.149077 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7khk\" (UniqueName: \"kubernetes.io/projected/87313be0-fc9a-4f6b-a50c-1c2adc167dad-kube-api-access-h7khk\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.149162 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-public-tls-certs\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.149192 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-scripts\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.149241 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-config-data\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.149485 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-fernet-keys\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.149629 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-combined-ca-bundle\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.149652 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-internal-tls-certs\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.149679 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-credential-keys\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.251833 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-combined-ca-bundle\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.251895 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-internal-tls-certs\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.251923 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-credential-keys\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.251993 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7khk\" (UniqueName: 
\"kubernetes.io/projected/87313be0-fc9a-4f6b-a50c-1c2adc167dad-kube-api-access-h7khk\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.252020 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-public-tls-certs\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.252037 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-scripts\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.252075 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-config-data\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.252116 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-fernet-keys\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.273203 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-public-tls-certs\") pod \"keystone-787778d6bb-5glsd\" 
(UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.273286 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-fernet-keys\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.273342 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-config-data\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.273948 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-credential-keys\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.274846 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-internal-tls-certs\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.279120 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-combined-ca-bundle\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: 
I1203 19:13:03.279922 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87313be0-fc9a-4f6b-a50c-1c2adc167dad-scripts\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.304103 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7khk\" (UniqueName: \"kubernetes.io/projected/87313be0-fc9a-4f6b-a50c-1c2adc167dad-kube-api-access-h7khk\") pod \"keystone-787778d6bb-5glsd\" (UID: \"87313be0-fc9a-4f6b-a50c-1c2adc167dad\") " pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:03 crc kubenswrapper[4731]: I1203 19:13:03.596775 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:05 crc kubenswrapper[4731]: I1203 19:13:05.915931 4731 generic.go:334] "Generic (PLEG): container finished" podID="74dfcf3b-850d-48ea-9188-297579680f01" containerID="c5a7e7747a6e165d39c33b622b7b07df305d27612d046d6e8b4a19635eb321bc" exitCode=0 Dec 03 19:13:05 crc kubenswrapper[4731]: I1203 19:13:05.916008 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-28phz" event={"ID":"74dfcf3b-850d-48ea-9188-297579680f01","Type":"ContainerDied","Data":"c5a7e7747a6e165d39c33b622b7b07df305d27612d046d6e8b4a19635eb321bc"} Dec 03 19:13:07 crc kubenswrapper[4731]: W1203 19:13:07.097327 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13c7ff16_cffb_40a4_909c_6c4bca6598a3.slice/crio-1427351a8bca3db6a24f12bacb21dcbf92cda5bb84e59b0d8d21afc63d686501 WatchSource:0}: Error finding container 1427351a8bca3db6a24f12bacb21dcbf92cda5bb84e59b0d8d21afc63d686501: Status 404 returned error can't find the container with id 1427351a8bca3db6a24f12bacb21dcbf92cda5bb84e59b0d8d21afc63d686501 Dec 03 
19:13:07 crc kubenswrapper[4731]: I1203 19:13:07.272220 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tx58l" Dec 03 19:13:07 crc kubenswrapper[4731]: I1203 19:13:07.414484 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-combined-ca-bundle\") pod \"936695f7-54f4-4fa7-8373-75c84337ea1f\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " Dec 03 19:13:07 crc kubenswrapper[4731]: I1203 19:13:07.414780 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m478c\" (UniqueName: \"kubernetes.io/projected/936695f7-54f4-4fa7-8373-75c84337ea1f-kube-api-access-m478c\") pod \"936695f7-54f4-4fa7-8373-75c84337ea1f\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " Dec 03 19:13:07 crc kubenswrapper[4731]: I1203 19:13:07.414883 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-db-sync-config-data\") pod \"936695f7-54f4-4fa7-8373-75c84337ea1f\" (UID: \"936695f7-54f4-4fa7-8373-75c84337ea1f\") " Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.422206 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "936695f7-54f4-4fa7-8373-75c84337ea1f" (UID: "936695f7-54f4-4fa7-8373-75c84337ea1f"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.423805 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/936695f7-54f4-4fa7-8373-75c84337ea1f-kube-api-access-m478c" (OuterVolumeSpecName: "kube-api-access-m478c") pod "936695f7-54f4-4fa7-8373-75c84337ea1f" (UID: "936695f7-54f4-4fa7-8373-75c84337ea1f"). InnerVolumeSpecName "kube-api-access-m478c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.466435 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-28phz" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.469992 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "936695f7-54f4-4fa7-8373-75c84337ea1f" (UID: "936695f7-54f4-4fa7-8373-75c84337ea1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.516949 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.516979 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m478c\" (UniqueName: \"kubernetes.io/projected/936695f7-54f4-4fa7-8373-75c84337ea1f-kube-api-access-m478c\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.516990 4731 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/936695f7-54f4-4fa7-8373-75c84337ea1f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.618656 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74dfcf3b-850d-48ea-9188-297579680f01-logs\") pod \"74dfcf3b-850d-48ea-9188-297579680f01\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.618776 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x8gb\" (UniqueName: \"kubernetes.io/projected/74dfcf3b-850d-48ea-9188-297579680f01-kube-api-access-4x8gb\") pod \"74dfcf3b-850d-48ea-9188-297579680f01\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.618867 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-config-data\") pod \"74dfcf3b-850d-48ea-9188-297579680f01\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.618960 4731 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-scripts\") pod \"74dfcf3b-850d-48ea-9188-297579680f01\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.619086 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-combined-ca-bundle\") pod \"74dfcf3b-850d-48ea-9188-297579680f01\" (UID: \"74dfcf3b-850d-48ea-9188-297579680f01\") " Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.619326 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74dfcf3b-850d-48ea-9188-297579680f01-logs" (OuterVolumeSpecName: "logs") pod "74dfcf3b-850d-48ea-9188-297579680f01" (UID: "74dfcf3b-850d-48ea-9188-297579680f01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.619995 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74dfcf3b-850d-48ea-9188-297579680f01-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.623144 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74dfcf3b-850d-48ea-9188-297579680f01-kube-api-access-4x8gb" (OuterVolumeSpecName: "kube-api-access-4x8gb") pod "74dfcf3b-850d-48ea-9188-297579680f01" (UID: "74dfcf3b-850d-48ea-9188-297579680f01"). InnerVolumeSpecName "kube-api-access-4x8gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.623700 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-scripts" (OuterVolumeSpecName: "scripts") pod "74dfcf3b-850d-48ea-9188-297579680f01" (UID: "74dfcf3b-850d-48ea-9188-297579680f01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.653441 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74dfcf3b-850d-48ea-9188-297579680f01" (UID: "74dfcf3b-850d-48ea-9188-297579680f01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.673667 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-config-data" (OuterVolumeSpecName: "config-data") pod "74dfcf3b-850d-48ea-9188-297579680f01" (UID: "74dfcf3b-850d-48ea-9188-297579680f01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.721883 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.721917 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x8gb\" (UniqueName: \"kubernetes.io/projected/74dfcf3b-850d-48ea-9188-297579680f01-kube-api-access-4x8gb\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.721930 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:07.721940 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74dfcf3b-850d-48ea-9188-297579680f01-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.010279 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tx58l" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.011839 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tx58l" event={"ID":"936695f7-54f4-4fa7-8373-75c84337ea1f","Type":"ContainerDied","Data":"37ce14231c9fc945e9742968726f1b3a097747084e5ca372fd48350a5ba33ef1"} Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.011921 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37ce14231c9fc945e9742968726f1b3a097747084e5ca372fd48350a5ba33ef1" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.017888 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12509218-711b-46a4-a560-493ce03af965","Type":"ContainerStarted","Data":"49f664c93260fd8752114706cba86b7a3fed72541df9731bb81d119e0a1bd02e"} Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.021496 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-28phz" event={"ID":"74dfcf3b-850d-48ea-9188-297579680f01","Type":"ContainerDied","Data":"8cdad720fe03671ba979dce2245fa68dbf23bd669d9839b1fc5eba1fcd6e6f0f"} Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.021535 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cdad720fe03671ba979dce2245fa68dbf23bd669d9839b1fc5eba1fcd6e6f0f" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.021674 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-28phz" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.049878 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748fc65857-2r69r" event={"ID":"74e9f605-1edd-4f8e-af56-133041b4c068","Type":"ContainerStarted","Data":"bb5d5dd665e614678a40d75d2105a912835db2777a816a0c51508c087e71247b"} Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.050079 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.084508 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f7d595d8d-wt4nd"] Dec 03 19:13:08 crc kubenswrapper[4731]: E1203 19:13:08.085006 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74dfcf3b-850d-48ea-9188-297579680f01" containerName="placement-db-sync" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.085023 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="74dfcf3b-850d-48ea-9188-297579680f01" containerName="placement-db-sync" Dec 03 19:13:08 crc kubenswrapper[4731]: E1203 19:13:08.085060 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936695f7-54f4-4fa7-8373-75c84337ea1f" containerName="barbican-db-sync" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.085067 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="936695f7-54f4-4fa7-8373-75c84337ea1f" containerName="barbican-db-sync" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.085366 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="936695f7-54f4-4fa7-8373-75c84337ea1f" containerName="barbican-db-sync" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.085384 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="74dfcf3b-850d-48ea-9188-297579680f01" containerName="placement-db-sync" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.089043 4731 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.094802 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-748fc65857-2r69r" podStartSLOduration=8.0947791 podStartE2EDuration="8.0947791s" podCreationTimestamp="2025-12-03 19:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:08.088809352 +0000 UTC m=+1108.687403816" watchObservedRunningTime="2025-12-03 19:13:08.0947791 +0000 UTC m=+1108.693373564" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.098012 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2m8tl" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.098688 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerStarted","Data":"33662cd7ba2522f0d0ce52ce5d222e87eabfd59d29dac43f44181809555b2c02"} Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.113381 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.113973 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.114741 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.138780 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13c7ff16-cffb-40a4-909c-6c4bca6598a3","Type":"ContainerStarted","Data":"1427351a8bca3db6a24f12bacb21dcbf92cda5bb84e59b0d8d21afc63d686501"} Dec 03 19:13:08 crc kubenswrapper[4731]: 
I1203 19:13:08.144040 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.181667 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f7d595d8d-wt4nd"] Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.243147 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9981e783-5ae8-488c-85ec-2b06327f324c-logs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.243277 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h665\" (UniqueName: \"kubernetes.io/projected/9981e783-5ae8-488c-85ec-2b06327f324c-kube-api-access-4h665\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.243339 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-scripts\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.243410 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-combined-ca-bundle\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.243437 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-internal-tls-certs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.243466 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-public-tls-certs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.243643 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-config-data\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.348482 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9981e783-5ae8-488c-85ec-2b06327f324c-logs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.348556 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h665\" (UniqueName: \"kubernetes.io/projected/9981e783-5ae8-488c-85ec-2b06327f324c-kube-api-access-4h665\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.348597 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-scripts\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.348646 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-combined-ca-bundle\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.348675 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-internal-tls-certs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.348695 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-public-tls-certs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.348760 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-config-data\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.353152 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9981e783-5ae8-488c-85ec-2b06327f324c-logs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.357082 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-config-data\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.357984 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-scripts\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.366737 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-internal-tls-certs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.383976 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-public-tls-certs\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.386032 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9981e783-5ae8-488c-85ec-2b06327f324c-combined-ca-bundle\") pod \"placement-6f7d595d8d-wt4nd\" (UID: 
\"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.390087 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h665\" (UniqueName: \"kubernetes.io/projected/9981e783-5ae8-488c-85ec-2b06327f324c-kube-api-access-4h665\") pod \"placement-6f7d595d8d-wt4nd\" (UID: \"9981e783-5ae8-488c-85ec-2b06327f324c\") " pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.505790 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.590163 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-86cfb76fcc-q9qnm"] Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.594585 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.599500 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.599939 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.600184 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lh84f" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.606124 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r"] Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.608373 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.612214 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.637209 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-86cfb76fcc-q9qnm"] Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.656839 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r"] Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.768778 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4bzj\" (UniqueName: \"kubernetes.io/projected/2757b374-1ad9-404d-89a6-a033996ac07c-kube-api-access-x4bzj\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.768836 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-config-data-custom\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.768865 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-config-data\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.768882 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-combined-ca-bundle\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.768932 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-combined-ca-bundle\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.768949 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2757b374-1ad9-404d-89a6-a033996ac07c-logs\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.768967 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4cdl\" (UniqueName: \"kubernetes.io/projected/895114ae-e1ff-4386-8d43-1c7a2a9f2867-kube-api-access-z4cdl\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.768985 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-config-data-custom\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") 
" pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.769032 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-config-data\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.769063 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895114ae-e1ff-4386-8d43-1c7a2a9f2867-logs\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.806767 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77f589d474-ffcg5"] Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.808866 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.818692 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.822560 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77f589d474-ffcg5"] Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.870923 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4bzj\" (UniqueName: \"kubernetes.io/projected/2757b374-1ad9-404d-89a6-a033996ac07c-kube-api-access-x4bzj\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873288 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-config-data-custom\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873359 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-config-data\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873378 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-combined-ca-bundle\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " 
pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873464 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-combined-ca-bundle\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873509 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2757b374-1ad9-404d-89a6-a033996ac07c-logs\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873534 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4cdl\" (UniqueName: \"kubernetes.io/projected/895114ae-e1ff-4386-8d43-1c7a2a9f2867-kube-api-access-z4cdl\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873558 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-config-data-custom\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873623 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-config-data\") pod 
\"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.873659 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895114ae-e1ff-4386-8d43-1c7a2a9f2867-logs\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.874109 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895114ae-e1ff-4386-8d43-1c7a2a9f2867-logs\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.877767 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2757b374-1ad9-404d-89a6-a033996ac07c-logs\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.889951 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-combined-ca-bundle\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.893650 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-config-data\") pod 
\"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.895341 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-config-data-custom\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.895487 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-config-data\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.899059 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4bzj\" (UniqueName: \"kubernetes.io/projected/2757b374-1ad9-404d-89a6-a033996ac07c-kube-api-access-x4bzj\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.899269 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2757b374-1ad9-404d-89a6-a033996ac07c-combined-ca-bundle\") pod \"barbican-worker-86cfb76fcc-q9qnm\" (UID: \"2757b374-1ad9-404d-89a6-a033996ac07c\") " pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.899799 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/895114ae-e1ff-4386-8d43-1c7a2a9f2867-config-data-custom\") pod 
\"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.902761 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4cdl\" (UniqueName: \"kubernetes.io/projected/895114ae-e1ff-4386-8d43-1c7a2a9f2867-kube-api-access-z4cdl\") pod \"barbican-keystone-listener-76b8bcd5f6-jzp2r\" (UID: \"895114ae-e1ff-4386-8d43-1c7a2a9f2867\") " pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.940394 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-86cfb76fcc-q9qnm" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.961209 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.975346 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data-custom\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.976482 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002841f-20eb-4b1c-b3bd-fb237cf3833a-logs\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.976513 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-combined-ca-bundle\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.977220 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:08 crc kubenswrapper[4731]: I1203 19:13:08.977297 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdnrm\" (UniqueName: \"kubernetes.io/projected/6002841f-20eb-4b1c-b3bd-fb237cf3833a-kube-api-access-mdnrm\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.084480 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002841f-20eb-4b1c-b3bd-fb237cf3833a-logs\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.084532 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-combined-ca-bundle\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.084678 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.084710 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdnrm\" (UniqueName: \"kubernetes.io/projected/6002841f-20eb-4b1c-b3bd-fb237cf3833a-kube-api-access-mdnrm\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.084776 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data-custom\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.089851 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002841f-20eb-4b1c-b3bd-fb237cf3833a-logs\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.090834 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-combined-ca-bundle\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.100392 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.101022 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data-custom\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.108982 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-787778d6bb-5glsd"] Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.127836 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdnrm\" (UniqueName: \"kubernetes.io/projected/6002841f-20eb-4b1c-b3bd-fb237cf3833a-kube-api-access-mdnrm\") pod \"barbican-api-77f589d474-ffcg5\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.196355 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.233446 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-787778d6bb-5glsd" event={"ID":"87313be0-fc9a-4f6b-a50c-1c2adc167dad","Type":"ContainerStarted","Data":"48ed32ac188e35dd45855b2ad5c5aab6bdc28127552c983cc4755f8c52cd22a0"} Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.236074 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12509218-711b-46a4-a560-493ce03af965","Type":"ContainerStarted","Data":"d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d"} Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.242064 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13c7ff16-cffb-40a4-909c-6c4bca6598a3","Type":"ContainerStarted","Data":"5fc8cd185db9f6cc74a3980df1d6bcbee6f797152e81faf499063ed07fa23458"} Dec 03 19:13:09 crc kubenswrapper[4731]: W1203 19:13:09.313224 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9981e783_5ae8_488c_85ec_2b06327f324c.slice/crio-5a7065b7481635cfb2ff786810cf3cc1c1d4471ddc441276d7298931c77e5545 WatchSource:0}: Error finding container 5a7065b7481635cfb2ff786810cf3cc1c1d4471ddc441276d7298931c77e5545: Status 404 returned error can't find the container with id 5a7065b7481635cfb2ff786810cf3cc1c1d4471ddc441276d7298931c77e5545 Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.324643 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f7d595d8d-wt4nd"] Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.658085 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-86cfb76fcc-q9qnm"] Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.913900 4731 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-api-77f589d474-ffcg5"] Dec 03 19:13:09 crc kubenswrapper[4731]: I1203 19:13:09.948766 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r"] Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.257633 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-787778d6bb-5glsd" event={"ID":"87313be0-fc9a-4f6b-a50c-1c2adc167dad","Type":"ContainerStarted","Data":"a6ceacf84c3a04448021ada6ff9f99011dbee80fbad005cb967ad82a9bf913c6"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.258125 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.259980 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12509218-711b-46a4-a560-493ce03af965","Type":"ContainerStarted","Data":"2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.266746 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77f589d474-ffcg5" event={"ID":"6002841f-20eb-4b1c-b3bd-fb237cf3833a","Type":"ContainerStarted","Data":"643ccac1e98aca3f257e5dbd1e4dde0ff10a837f1f03cfb0f8ebaa0fb2f9a406"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.281221 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bhw9w" event={"ID":"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee","Type":"ContainerStarted","Data":"3543322e0433654a37eee878cab3b4bbbc3fa0af2fe9c9a0972efffd1f2eae76"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.285480 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86cfb76fcc-q9qnm" event={"ID":"2757b374-1ad9-404d-89a6-a033996ac07c","Type":"ContainerStarted","Data":"9bc4ea10ec5948440e86b7e1813d190d061cdc0bf46f28f0caa25f482bcd6301"} Dec 03 
19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.298885 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-787778d6bb-5glsd" podStartSLOduration=8.298861512 podStartE2EDuration="8.298861512s" podCreationTimestamp="2025-12-03 19:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:10.293752012 +0000 UTC m=+1110.892346476" watchObservedRunningTime="2025-12-03 19:13:10.298861512 +0000 UTC m=+1110.897455976" Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.299666 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13c7ff16-cffb-40a4-909c-6c4bca6598a3","Type":"ContainerStarted","Data":"f71dbaecdc1921aa124718e4e8b95186e921c901e8d1fc6171662e778a1b2c06"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.313904 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" event={"ID":"895114ae-e1ff-4386-8d43-1c7a2a9f2867","Type":"ContainerStarted","Data":"7e43ec4f341f729fece7184c953f21648c332526163ba27ee4c852709eebbe7e"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.318492 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f7d595d8d-wt4nd" event={"ID":"9981e783-5ae8-488c-85ec-2b06327f324c","Type":"ContainerStarted","Data":"ab018d76f13e6e1b4b28eb4d216f4b3916bbeeafe6d692a427feae53afb87114"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.318526 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f7d595d8d-wt4nd" event={"ID":"9981e783-5ae8-488c-85ec-2b06327f324c","Type":"ContainerStarted","Data":"1f417e64d548f3f46b0edd0e3c51086673d8c7dca3b13571a158e05da5194b17"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.318536 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f7d595d8d-wt4nd" 
event={"ID":"9981e783-5ae8-488c-85ec-2b06327f324c","Type":"ContainerStarted","Data":"5a7065b7481635cfb2ff786810cf3cc1c1d4471ddc441276d7298931c77e5545"} Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.319424 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.319448 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.328133 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bhw9w" podStartSLOduration=5.820714548 podStartE2EDuration="49.328114239s" podCreationTimestamp="2025-12-03 19:12:21 +0000 UTC" firstStartedPulling="2025-12-03 19:12:24.402962389 +0000 UTC m=+1065.001556853" lastFinishedPulling="2025-12-03 19:13:07.91036208 +0000 UTC m=+1108.508956544" observedRunningTime="2025-12-03 19:13:10.324546677 +0000 UTC m=+1110.923141141" watchObservedRunningTime="2025-12-03 19:13:10.328114239 +0000 UTC m=+1110.926708703" Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.350793 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.350770939 podStartE2EDuration="9.350770939s" podCreationTimestamp="2025-12-03 19:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:10.348442147 +0000 UTC m=+1110.947036631" watchObservedRunningTime="2025-12-03 19:13:10.350770939 +0000 UTC m=+1110.949365403" Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.383486 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f7d595d8d-wt4nd" podStartSLOduration=2.383462983 podStartE2EDuration="2.383462983s" podCreationTimestamp="2025-12-03 19:13:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:10.381369388 +0000 UTC m=+1110.979963882" watchObservedRunningTime="2025-12-03 19:13:10.383462983 +0000 UTC m=+1110.982057447" Dec 03 19:13:10 crc kubenswrapper[4731]: I1203 19:13:10.408362 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.408340653 podStartE2EDuration="9.408340653s" podCreationTimestamp="2025-12-03 19:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:10.404153192 +0000 UTC m=+1111.002747656" watchObservedRunningTime="2025-12-03 19:13:10.408340653 +0000 UTC m=+1111.006935117" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.082643 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-687f68f6b4-jvzgv" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.136:8443: connect: connection refused" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.176491 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7578458cb-br8st" podUID="5a524772-1481-4781-9847-f3394664a2d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.137:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.137:8443: connect: connection refused" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.760999 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55489cbdc4-8kvp2"] Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.790781 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55489cbdc4-8kvp2"] Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 
19:13:11.790932 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.811652 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.812144 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.897731 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-config-data\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.898129 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60084d2a-1621-420c-80ac-fb38a0eae005-logs\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.898174 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-config-data-custom\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.898210 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-public-tls-certs\") pod \"barbican-api-55489cbdc4-8kvp2\" 
(UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.898238 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-internal-tls-certs\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.898306 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkwc7\" (UniqueName: \"kubernetes.io/projected/60084d2a-1621-420c-80ac-fb38a0eae005-kube-api-access-nkwc7\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.898363 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-combined-ca-bundle\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.919658 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.919988 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.953884 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 19:13:11 crc kubenswrapper[4731]: I1203 19:13:11.986353 4731 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.001996 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-public-tls-certs\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.002206 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-internal-tls-certs\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.004555 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkwc7\" (UniqueName: \"kubernetes.io/projected/60084d2a-1621-420c-80ac-fb38a0eae005-kube-api-access-nkwc7\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.004954 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-combined-ca-bundle\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.005273 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-config-data\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: 
\"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.005408 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60084d2a-1621-420c-80ac-fb38a0eae005-logs\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.005595 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-config-data-custom\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.008147 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-public-tls-certs\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.008486 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60084d2a-1621-420c-80ac-fb38a0eae005-logs\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.019276 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-config-data\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc 
kubenswrapper[4731]: I1203 19:13:12.019314 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-config-data-custom\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.022703 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-internal-tls-certs\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.024782 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.024929 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.026436 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60084d2a-1621-420c-80ac-fb38a0eae005-combined-ca-bundle\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.047322 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkwc7\" (UniqueName: \"kubernetes.io/projected/60084d2a-1621-420c-80ac-fb38a0eae005-kube-api-access-nkwc7\") pod \"barbican-api-55489cbdc4-8kvp2\" (UID: \"60084d2a-1621-420c-80ac-fb38a0eae005\") " pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.068419 4731 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.099569 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.286796 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.392944 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77f589d474-ffcg5" event={"ID":"6002841f-20eb-4b1c-b3bd-fb237cf3833a","Type":"ContainerStarted","Data":"85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31"} Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.393022 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77f589d474-ffcg5" event={"ID":"6002841f-20eb-4b1c-b3bd-fb237cf3833a","Type":"ContainerStarted","Data":"f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c"} Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.393238 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.393284 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.393566 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.393613 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.830442 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77f589d474-ffcg5" 
podStartSLOduration=4.830419946 podStartE2EDuration="4.830419946s" podCreationTimestamp="2025-12-03 19:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:12.419058738 +0000 UTC m=+1113.017653202" watchObservedRunningTime="2025-12-03 19:13:12.830419946 +0000 UTC m=+1113.429014410" Dec 03 19:13:12 crc kubenswrapper[4731]: I1203 19:13:12.837551 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55489cbdc4-8kvp2"] Dec 03 19:13:13 crc kubenswrapper[4731]: I1203 19:13:13.406112 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:13 crc kubenswrapper[4731]: I1203 19:13:13.406441 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:14 crc kubenswrapper[4731]: I1203 19:13:14.416142 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55489cbdc4-8kvp2" event={"ID":"60084d2a-1621-420c-80ac-fb38a0eae005","Type":"ContainerStarted","Data":"17346e3866e5610934dfb3027bf97c1676eadcef95e64107b1cf242e487f8daf"} Dec 03 19:13:15 crc kubenswrapper[4731]: I1203 19:13:15.428389 4731 generic.go:334] "Generic (PLEG): container finished" podID="50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" containerID="3543322e0433654a37eee878cab3b4bbbc3fa0af2fe9c9a0972efffd1f2eae76" exitCode=0 Dec 03 19:13:15 crc kubenswrapper[4731]: I1203 19:13:15.428485 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bhw9w" event={"ID":"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee","Type":"ContainerDied","Data":"3543322e0433654a37eee878cab3b4bbbc3fa0af2fe9c9a0972efffd1f2eae76"} Dec 03 19:13:15 crc kubenswrapper[4731]: I1203 19:13:15.566012 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 19:13:15 crc 
kubenswrapper[4731]: I1203 19:13:15.584442 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:15 crc kubenswrapper[4731]: I1203 19:13:15.597158 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 19:13:17 crc kubenswrapper[4731]: I1203 19:13:17.452828 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86cfb76fcc-q9qnm" event={"ID":"2757b374-1ad9-404d-89a6-a033996ac07c","Type":"ContainerStarted","Data":"6b7e52ffe31a24cf2b18993eaddc11bf5140644ad756ab8074289327a3c5fa49"} Dec 03 19:13:17 crc kubenswrapper[4731]: I1203 19:13:17.455371 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55489cbdc4-8kvp2" event={"ID":"60084d2a-1621-420c-80ac-fb38a0eae005","Type":"ContainerStarted","Data":"dc69610a0467756c913e189b7a28f161421547b916f658437596234b89389a7f"} Dec 03 19:13:17 crc kubenswrapper[4731]: I1203 19:13:17.747973 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.443710 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.486398 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bhw9w" event={"ID":"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee","Type":"ContainerDied","Data":"2b1b113070d62638219d159d56c00bae75a4182cd493b61e7a2237f69d5bc016"} Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.486452 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1b113070d62638219d159d56c00bae75a4182cd493b61e7a2237f69d5bc016" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.486461 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bhw9w" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.507242 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-combined-ca-bundle\") pod \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.507398 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-etc-machine-id\") pod \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.507434 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-config-data\") pod \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.507467 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-db-sync-config-data\") pod \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.507656 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-scripts\") pod \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.507817 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-schb6\" 
(UniqueName: \"kubernetes.io/projected/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-kube-api-access-schb6\") pod \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\" (UID: \"50dcbd31-ca7a-47bc-831f-e5f5e2da78ee\") " Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.511086 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" (UID: "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.522534 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-scripts" (OuterVolumeSpecName: "scripts") pod "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" (UID: "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.522568 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" (UID: "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.522706 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-kube-api-access-schb6" (OuterVolumeSpecName: "kube-api-access-schb6") pod "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" (UID: "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee"). InnerVolumeSpecName "kube-api-access-schb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.603414 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" (UID: "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.610431 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.610537 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-schb6\" (UniqueName: \"kubernetes.io/projected/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-kube-api-access-schb6\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.610598 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.610665 4731 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.610753 4731 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.671032 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-config-data" (OuterVolumeSpecName: "config-data") pod "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" (UID: "50dcbd31-ca7a-47bc-831f-e5f5e2da78ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:18 crc kubenswrapper[4731]: I1203 19:13:18.720852 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.496741 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86cfb76fcc-q9qnm" event={"ID":"2757b374-1ad9-404d-89a6-a033996ac07c","Type":"ContainerStarted","Data":"49c8867c1447c5b3fc8a974361e7cc3f2c44969e0ec9629f0365c41d71eea8e2"} Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.500185 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55489cbdc4-8kvp2" event={"ID":"60084d2a-1621-420c-80ac-fb38a0eae005","Type":"ContainerStarted","Data":"83a7f9ae81309289edc56d471bd42f173c92cc5b61467cf561ef9426c7509cce"} Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.500603 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.500909 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.503831 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerStarted","Data":"205bf3ac547ad1b6a8a2c2acbd2ac9ab4a77f693335ea5ed64330d7579d8ff10"} Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.503919 4731 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="ceilometer-central-agent" containerID="cri-o://82c25f4498741178874aec37b41cdd665e3584ee8f0d119e87a79e877938ac42" gracePeriod=30 Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.504049 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.504302 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="sg-core" containerID="cri-o://33662cd7ba2522f0d0ce52ce5d222e87eabfd59d29dac43f44181809555b2c02" gracePeriod=30 Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.504984 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="ceilometer-notification-agent" containerID="cri-o://c799c4598778d6cb537fbff91743aa1254ea976f68d020b92903ed42ca10d4d1" gracePeriod=30 Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.505067 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="proxy-httpd" containerID="cri-o://205bf3ac547ad1b6a8a2c2acbd2ac9ab4a77f693335ea5ed64330d7579d8ff10" gracePeriod=30 Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.518974 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" event={"ID":"895114ae-e1ff-4386-8d43-1c7a2a9f2867","Type":"ContainerStarted","Data":"d5708fc0685ce932fbd78513827e861e110d9bc6941bf3be2693784040ef0518"} Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.519053 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" 
event={"ID":"895114ae-e1ff-4386-8d43-1c7a2a9f2867","Type":"ContainerStarted","Data":"35286a8432f9b056e21ed040225212e4dbc804622a313e1e12cff5a33831fb16"} Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.521999 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-86cfb76fcc-q9qnm" podStartSLOduration=7.652289961 podStartE2EDuration="11.521988613s" podCreationTimestamp="2025-12-03 19:13:08 +0000 UTC" firstStartedPulling="2025-12-03 19:13:09.679464785 +0000 UTC m=+1110.278059249" lastFinishedPulling="2025-12-03 19:13:13.549163447 +0000 UTC m=+1114.147757901" observedRunningTime="2025-12-03 19:13:19.52033255 +0000 UTC m=+1120.118927014" watchObservedRunningTime="2025-12-03 19:13:19.521988613 +0000 UTC m=+1120.120583077" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.549083 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55489cbdc4-8kvp2" podStartSLOduration=8.54904313 podStartE2EDuration="8.54904313s" podCreationTimestamp="2025-12-03 19:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:19.547032977 +0000 UTC m=+1120.145627441" watchObservedRunningTime="2025-12-03 19:13:19.54904313 +0000 UTC m=+1120.147637594" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.598549 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-76b8bcd5f6-jzp2r" podStartSLOduration=8.033203674 podStartE2EDuration="11.598522411s" podCreationTimestamp="2025-12-03 19:13:08 +0000 UTC" firstStartedPulling="2025-12-03 19:13:09.974479778 +0000 UTC m=+1110.573074232" lastFinishedPulling="2025-12-03 19:13:13.539798505 +0000 UTC m=+1114.138392969" observedRunningTime="2025-12-03 19:13:19.596662202 +0000 UTC m=+1120.195256666" watchObservedRunningTime="2025-12-03 19:13:19.598522411 +0000 UTC m=+1120.197116865" Dec 03 
19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.606746 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.770744756 podStartE2EDuration="57.606720868s" podCreationTimestamp="2025-12-03 19:12:22 +0000 UTC" firstStartedPulling="2025-12-03 19:12:24.407132988 +0000 UTC m=+1065.005727452" lastFinishedPulling="2025-12-03 19:13:18.24310909 +0000 UTC m=+1118.841703564" observedRunningTime="2025-12-03 19:13:19.571663059 +0000 UTC m=+1120.170257523" watchObservedRunningTime="2025-12-03 19:13:19.606720868 +0000 UTC m=+1120.205315332" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.778910 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:19 crc kubenswrapper[4731]: E1203 19:13:19.794867 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" containerName="cinder-db-sync" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.794896 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" containerName="cinder-db-sync" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.795117 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" containerName="cinder-db-sync" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.796316 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.807876 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.808394 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.808610 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mqzw2" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.808996 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.822280 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.849612 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.849676 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.849724 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " 
pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.849762 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.849791 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhz8w\" (UniqueName: \"kubernetes.io/projected/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-kube-api-access-vhz8w\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.849823 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.951827 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.952160 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 
19:13:19.952285 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.952386 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.952479 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhz8w\" (UniqueName: \"kubernetes.io/projected/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-kube-api-access-vhz8w\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.952560 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.960240 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.960753 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.962532 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.963434 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.964362 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:19 crc kubenswrapper[4731]: I1203 19:13:19.978073 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhz8w\" (UniqueName: \"kubernetes.io/projected/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-kube-api-access-vhz8w\") pod \"cinder-scheduler-0\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.057606 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.067217 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.072990 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.089374 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.156629 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-scripts\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.156680 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e399280-7b1a-4c02-9ef6-787d93fe41c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.156701 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e399280-7b1a-4c02-9ef6-787d93fe41c5-logs\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.156786 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8fbf\" (UniqueName: \"kubernetes.io/projected/2e399280-7b1a-4c02-9ef6-787d93fe41c5-kube-api-access-r8fbf\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.156805 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.156824 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.156874 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.190877 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.258768 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.258857 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-scripts\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.258886 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e399280-7b1a-4c02-9ef6-787d93fe41c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.258905 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e399280-7b1a-4c02-9ef6-787d93fe41c5-logs\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.258985 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8fbf\" (UniqueName: \"kubernetes.io/projected/2e399280-7b1a-4c02-9ef6-787d93fe41c5-kube-api-access-r8fbf\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.259009 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.259029 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.259407 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e399280-7b1a-4c02-9ef6-787d93fe41c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.260852 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e399280-7b1a-4c02-9ef6-787d93fe41c5-logs\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.264104 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-scripts\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.274510 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.275234 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.286278 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.287082 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8fbf\" (UniqueName: \"kubernetes.io/projected/2e399280-7b1a-4c02-9ef6-787d93fe41c5-kube-api-access-r8fbf\") pod \"cinder-api-0\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.392993 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.561149 4731 generic.go:334] "Generic (PLEG): container finished" podID="1df5a151-2186-4252-b04e-148a055b6a9d" containerID="205bf3ac547ad1b6a8a2c2acbd2ac9ab4a77f693335ea5ed64330d7579d8ff10" exitCode=0 Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.561199 4731 generic.go:334] "Generic (PLEG): container finished" podID="1df5a151-2186-4252-b04e-148a055b6a9d" containerID="33662cd7ba2522f0d0ce52ce5d222e87eabfd59d29dac43f44181809555b2c02" exitCode=2 Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.562330 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerDied","Data":"205bf3ac547ad1b6a8a2c2acbd2ac9ab4a77f693335ea5ed64330d7579d8ff10"} Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.562398 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerDied","Data":"33662cd7ba2522f0d0ce52ce5d222e87eabfd59d29dac43f44181809555b2c02"} Dec 03 19:13:20 crc kubenswrapper[4731]: I1203 19:13:20.563121 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:21 crc kubenswrapper[4731]: I1203 19:13:21.063512 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:21 crc kubenswrapper[4731]: I1203 19:13:21.079356 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-687f68f6b4-jvzgv" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.136:8443: connect: connection refused" Dec 03 19:13:21 crc kubenswrapper[4731]: I1203 19:13:21.167194 4731 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-7578458cb-br8st" podUID="5a524772-1481-4781-9847-f3394664a2d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.137:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.137:8443: connect: connection refused" Dec 03 19:13:21 crc kubenswrapper[4731]: I1203 19:13:21.578843 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c49ccc8-b3e2-4065-97e6-b789aac2fa69","Type":"ContainerStarted","Data":"60b41764c58b38aee9197d61ae2c3d496a2ed9f7c38c5b0ed033336d1cfb20e8"} Dec 03 19:13:21 crc kubenswrapper[4731]: I1203 19:13:21.582391 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e399280-7b1a-4c02-9ef6-787d93fe41c5","Type":"ContainerStarted","Data":"c0e4b3df2a1952189bcb040528c72cd9fc09fb7694742dc1a26ade36094da95c"} Dec 03 19:13:21 crc kubenswrapper[4731]: I1203 19:13:21.590776 4731 generic.go:334] "Generic (PLEG): container finished" podID="1df5a151-2186-4252-b04e-148a055b6a9d" containerID="82c25f4498741178874aec37b41cdd665e3584ee8f0d119e87a79e877938ac42" exitCode=0 Dec 03 19:13:21 crc kubenswrapper[4731]: I1203 19:13:21.590841 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerDied","Data":"82c25f4498741178874aec37b41cdd665e3584ee8f0d119e87a79e877938ac42"} Dec 03 19:13:22 crc kubenswrapper[4731]: I1203 19:13:22.057554 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:22 crc kubenswrapper[4731]: I1203 19:13:22.570946 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:22 crc kubenswrapper[4731]: I1203 19:13:22.638102 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"2e399280-7b1a-4c02-9ef6-787d93fe41c5","Type":"ContainerStarted","Data":"9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4"} Dec 03 19:13:22 crc kubenswrapper[4731]: I1203 19:13:22.821322 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.661434 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c49ccc8-b3e2-4065-97e6-b789aac2fa69","Type":"ContainerStarted","Data":"575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1"} Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.694077 4731 generic.go:334] "Generic (PLEG): container finished" podID="1df5a151-2186-4252-b04e-148a055b6a9d" containerID="c799c4598778d6cb537fbff91743aa1254ea976f68d020b92903ed42ca10d4d1" exitCode=0 Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.694448 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerDied","Data":"c799c4598778d6cb537fbff91743aa1254ea976f68d020b92903ed42ca10d4d1"} Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.694595 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1df5a151-2186-4252-b04e-148a055b6a9d","Type":"ContainerDied","Data":"26fb2e8078bb336508bc5a22fd0ab0c4b6642dbfb726f8e0f3404f3cce8117b4"} Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.694700 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26fb2e8078bb336508bc5a22fd0ab0c4b6642dbfb726f8e0f3404f3cce8117b4" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.695716 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.790227 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-log-httpd\") pod \"1df5a151-2186-4252-b04e-148a055b6a9d\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.790676 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-config-data\") pod \"1df5a151-2186-4252-b04e-148a055b6a9d\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.790747 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-run-httpd\") pod \"1df5a151-2186-4252-b04e-148a055b6a9d\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.790850 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-combined-ca-bundle\") pod \"1df5a151-2186-4252-b04e-148a055b6a9d\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.790875 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-scripts\") pod \"1df5a151-2186-4252-b04e-148a055b6a9d\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.790973 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-sg-core-conf-yaml\") pod \"1df5a151-2186-4252-b04e-148a055b6a9d\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.791018 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqw9v\" (UniqueName: \"kubernetes.io/projected/1df5a151-2186-4252-b04e-148a055b6a9d-kube-api-access-fqw9v\") pod \"1df5a151-2186-4252-b04e-148a055b6a9d\" (UID: \"1df5a151-2186-4252-b04e-148a055b6a9d\") " Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.794336 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1df5a151-2186-4252-b04e-148a055b6a9d" (UID: "1df5a151-2186-4252-b04e-148a055b6a9d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.795459 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1df5a151-2186-4252-b04e-148a055b6a9d" (UID: "1df5a151-2186-4252-b04e-148a055b6a9d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.824475 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-scripts" (OuterVolumeSpecName: "scripts") pod "1df5a151-2186-4252-b04e-148a055b6a9d" (UID: "1df5a151-2186-4252-b04e-148a055b6a9d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.831013 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df5a151-2186-4252-b04e-148a055b6a9d-kube-api-access-fqw9v" (OuterVolumeSpecName: "kube-api-access-fqw9v") pod "1df5a151-2186-4252-b04e-148a055b6a9d" (UID: "1df5a151-2186-4252-b04e-148a055b6a9d"). InnerVolumeSpecName "kube-api-access-fqw9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.847421 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1df5a151-2186-4252-b04e-148a055b6a9d" (UID: "1df5a151-2186-4252-b04e-148a055b6a9d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.895534 4731 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.900870 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.902864 4731 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.902971 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqw9v\" (UniqueName: \"kubernetes.io/projected/1df5a151-2186-4252-b04e-148a055b6a9d-kube-api-access-fqw9v\") on node 
\"crc\" DevicePath \"\"" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.903075 4731 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1df5a151-2186-4252-b04e-148a055b6a9d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.937971 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-config-data" (OuterVolumeSpecName: "config-data") pod "1df5a151-2186-4252-b04e-148a055b6a9d" (UID: "1df5a151-2186-4252-b04e-148a055b6a9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:23 crc kubenswrapper[4731]: I1203 19:13:23.944683 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1df5a151-2186-4252-b04e-148a055b6a9d" (UID: "1df5a151-2186-4252-b04e-148a055b6a9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.005471 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.005512 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df5a151-2186-4252-b04e-148a055b6a9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.707716 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c49ccc8-b3e2-4065-97e6-b789aac2fa69","Type":"ContainerStarted","Data":"f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e"} Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.712076 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.717405 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e399280-7b1a-4c02-9ef6-787d93fe41c5","Type":"ContainerStarted","Data":"9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126"} Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.717637 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerName="cinder-api-log" containerID="cri-o://9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4" gracePeriod=30 Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.717685 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.717741 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerName="cinder-api" containerID="cri-o://9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126" gracePeriod=30 Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.745699 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.007103081 podStartE2EDuration="5.745678584s" podCreationTimestamp="2025-12-03 19:13:19 +0000 UTC" firstStartedPulling="2025-12-03 19:13:20.583708781 +0000 UTC m=+1121.182303245" lastFinishedPulling="2025-12-03 19:13:21.322284284 +0000 UTC m=+1121.920878748" observedRunningTime="2025-12-03 19:13:24.73664496 +0000 UTC m=+1125.335239444" watchObservedRunningTime="2025-12-03 19:13:24.745678584 +0000 UTC m=+1125.344273048" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.759527 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:24 crc 
kubenswrapper[4731]: I1203 19:13:24.772452 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.776212 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.77619365 podStartE2EDuration="4.77619365s" podCreationTimestamp="2025-12-03 19:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:24.773793785 +0000 UTC m=+1125.372388249" watchObservedRunningTime="2025-12-03 19:13:24.77619365 +0000 UTC m=+1125.374788114" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.807243 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:24 crc kubenswrapper[4731]: E1203 19:13:24.808690 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="ceilometer-notification-agent" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.808716 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="ceilometer-notification-agent" Dec 03 19:13:24 crc kubenswrapper[4731]: E1203 19:13:24.808735 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="ceilometer-central-agent" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.808742 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="ceilometer-central-agent" Dec 03 19:13:24 crc kubenswrapper[4731]: E1203 19:13:24.808757 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="proxy-httpd" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.808764 4731 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="proxy-httpd" Dec 03 19:13:24 crc kubenswrapper[4731]: E1203 19:13:24.808774 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="sg-core" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.808779 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="sg-core" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.809012 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="ceilometer-notification-agent" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.809029 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="ceilometer-central-agent" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.809039 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="proxy-httpd" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.809050 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" containerName="sg-core" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.810902 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.814247 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.815232 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.847368 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.874855 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.892677 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.932539 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44sd\" (UniqueName: \"kubernetes.io/projected/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-kube-api-access-p44sd\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.932641 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.932683 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.932749 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-run-httpd\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.932767 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-config-data\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.932831 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-scripts\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:24 crc kubenswrapper[4731]: I1203 19:13:24.932851 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-log-httpd\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.035156 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-config-data\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.035303 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-scripts\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.035329 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-log-httpd\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.035372 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44sd\" (UniqueName: \"kubernetes.io/projected/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-kube-api-access-p44sd\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.035434 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.035473 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.035514 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-run-httpd\") pod \"ceilometer-0\" (UID: 
\"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.035981 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-run-httpd\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.042805 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-log-httpd\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.047266 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-config-data\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.076421 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.076951 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.077886 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-scripts\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.084127 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44sd\" (UniqueName: \"kubernetes.io/projected/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-kube-api-access-p44sd\") pod \"ceilometer-0\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.176616 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.191329 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.723759 4731 generic.go:334] "Generic (PLEG): container finished" podID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerID="9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4" exitCode=143 Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.723997 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e399280-7b1a-4c02-9ef6-787d93fe41c5","Type":"ContainerDied","Data":"9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4"} Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.792816 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:25 crc kubenswrapper[4731]: I1203 19:13:25.875815 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df5a151-2186-4252-b04e-148a055b6a9d" path="/var/lib/kubelet/pods/1df5a151-2186-4252-b04e-148a055b6a9d/volumes" Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.468452 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.468860 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.468924 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.469868 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e59b557ae762b84b60c06ac0b9fadc27bee96a8b13a95e3f34bd03098de4d47"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.469931 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://4e59b557ae762b84b60c06ac0b9fadc27bee96a8b13a95e3f34bd03098de4d47" gracePeriod=600 Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.743911 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="4e59b557ae762b84b60c06ac0b9fadc27bee96a8b13a95e3f34bd03098de4d47" exitCode=0 Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.744062 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"4e59b557ae762b84b60c06ac0b9fadc27bee96a8b13a95e3f34bd03098de4d47"} Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.744534 4731 scope.go:117] "RemoveContainer" containerID="af854c9d3b32f450920436e43467df2efcb54fae683ee960492021d62f140295" Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.749485 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerStarted","Data":"1c26e9d9bf05e4dd0f6fa3c7afacc5f4639ea779bad7bc070918a0682fdddf2f"} Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.749540 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerStarted","Data":"886c63b839f5fd2670193fc125f78226519395c162ce4a2f7f5fa8daea0eb94a"} Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.788981 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55489cbdc4-8kvp2" Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.873763 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77f589d474-ffcg5"] Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.875120 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77f589d474-ffcg5" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api-log" containerID="cri-o://f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c" gracePeriod=30 Dec 03 19:13:26 crc kubenswrapper[4731]: I1203 19:13:26.875699 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77f589d474-ffcg5" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api" 
containerID="cri-o://85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31" gracePeriod=30 Dec 03 19:13:27 crc kubenswrapper[4731]: I1203 19:13:27.760376 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerStarted","Data":"f1644da2565236579ad8b309e5c788ba15bf623963231696b5babb64189bc778"} Dec 03 19:13:27 crc kubenswrapper[4731]: I1203 19:13:27.763173 4731 generic.go:334] "Generic (PLEG): container finished" podID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerID="f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c" exitCode=143 Dec 03 19:13:27 crc kubenswrapper[4731]: I1203 19:13:27.763280 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77f589d474-ffcg5" event={"ID":"6002841f-20eb-4b1c-b3bd-fb237cf3833a","Type":"ContainerDied","Data":"f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c"} Dec 03 19:13:27 crc kubenswrapper[4731]: I1203 19:13:27.766300 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"e2f44e072bc88870db26a778f706559cfc353499a5a66a4b1e40841fc6944db0"} Dec 03 19:13:28 crc kubenswrapper[4731]: I1203 19:13:28.778672 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerStarted","Data":"48aa671f3f1fd2f4eb44075a1ebe40ed2ffd07a5d8cf3c2a40dd9e1991172664"} Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.047069 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77f589d474-ffcg5" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.149:9311/healthcheck\": read tcp 10.217.0.2:44760->10.217.0.149:9311: read: connection reset by 
peer" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.047112 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77f589d474-ffcg5" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.149:9311/healthcheck\": read tcp 10.217.0.2:44766->10.217.0.149:9311: read: connection reset by peer" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.454004 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.500032 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.543126 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.589460 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdnrm\" (UniqueName: \"kubernetes.io/projected/6002841f-20eb-4b1c-b3bd-fb237cf3833a-kube-api-access-mdnrm\") pod \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.589516 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002841f-20eb-4b1c-b3bd-fb237cf3833a-logs\") pod \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.589639 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data\") pod \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " Dec 03 19:13:30 crc 
kubenswrapper[4731]: I1203 19:13:30.589729 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-combined-ca-bundle\") pod \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.589917 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data-custom\") pod \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\" (UID: \"6002841f-20eb-4b1c-b3bd-fb237cf3833a\") " Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.590443 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6002841f-20eb-4b1c-b3bd-fb237cf3833a-logs" (OuterVolumeSpecName: "logs") pod "6002841f-20eb-4b1c-b3bd-fb237cf3833a" (UID: "6002841f-20eb-4b1c-b3bd-fb237cf3833a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.597114 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6002841f-20eb-4b1c-b3bd-fb237cf3833a" (UID: "6002841f-20eb-4b1c-b3bd-fb237cf3833a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.597411 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6002841f-20eb-4b1c-b3bd-fb237cf3833a-kube-api-access-mdnrm" (OuterVolumeSpecName: "kube-api-access-mdnrm") pod "6002841f-20eb-4b1c-b3bd-fb237cf3833a" (UID: "6002841f-20eb-4b1c-b3bd-fb237cf3833a"). InnerVolumeSpecName "kube-api-access-mdnrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.635166 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6002841f-20eb-4b1c-b3bd-fb237cf3833a" (UID: "6002841f-20eb-4b1c-b3bd-fb237cf3833a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.665497 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data" (OuterVolumeSpecName: "config-data") pod "6002841f-20eb-4b1c-b3bd-fb237cf3833a" (UID: "6002841f-20eb-4b1c-b3bd-fb237cf3833a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.691663 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.691709 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.691722 4731 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002841f-20eb-4b1c-b3bd-fb237cf3833a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.691732 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdnrm\" (UniqueName: \"kubernetes.io/projected/6002841f-20eb-4b1c-b3bd-fb237cf3833a-kube-api-access-mdnrm\") on node 
\"crc\" DevicePath \"\"" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.691742 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002841f-20eb-4b1c-b3bd-fb237cf3833a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.813191 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerStarted","Data":"f38304e14069560b753b05dd7ebc0cc3206ee0bc39b4c709f11328ce4cdf391d"} Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.816275 4731 generic.go:334] "Generic (PLEG): container finished" podID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerID="85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31" exitCode=0 Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.816364 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77f589d474-ffcg5" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.816903 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerName="cinder-scheduler" containerID="cri-o://575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1" gracePeriod=30 Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.817022 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerName="probe" containerID="cri-o://f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e" gracePeriod=30 Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.816515 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.817588 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-77f589d474-ffcg5" event={"ID":"6002841f-20eb-4b1c-b3bd-fb237cf3833a","Type":"ContainerDied","Data":"85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31"} Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.817628 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77f589d474-ffcg5" event={"ID":"6002841f-20eb-4b1c-b3bd-fb237cf3833a","Type":"ContainerDied","Data":"643ccac1e98aca3f257e5dbd1e4dde0ff10a837f1f03cfb0f8ebaa0fb2f9a406"} Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.817660 4731 scope.go:117] "RemoveContainer" containerID="85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.840543 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.026663038 podStartE2EDuration="6.840510901s" podCreationTimestamp="2025-12-03 19:13:24 +0000 UTC" firstStartedPulling="2025-12-03 19:13:25.782680028 +0000 UTC m=+1126.381274482" lastFinishedPulling="2025-12-03 19:13:29.596527851 +0000 UTC m=+1130.195122345" observedRunningTime="2025-12-03 19:13:30.838064155 +0000 UTC m=+1131.436658669" watchObservedRunningTime="2025-12-03 19:13:30.840510901 +0000 UTC m=+1131.439105405" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.854654 4731 scope.go:117] "RemoveContainer" containerID="f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.876417 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77f589d474-ffcg5"] Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.878747 4731 scope.go:117] "RemoveContainer" containerID="85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31" Dec 03 19:13:30 crc kubenswrapper[4731]: E1203 19:13:30.879327 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31\": container with ID starting with 85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31 not found: ID does not exist" containerID="85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.879366 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31"} err="failed to get container status \"85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31\": rpc error: code = NotFound desc = could not find container \"85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31\": container with ID starting with 85481fe92dfa5034b19ce36c622055e8495e1e9eb7bb0080654440e34b68da31 not found: ID does not exist" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.879391 4731 scope.go:117] "RemoveContainer" containerID="f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c" Dec 03 19:13:30 crc kubenswrapper[4731]: E1203 19:13:30.879686 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c\": container with ID starting with f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c not found: ID does not exist" containerID="f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.879709 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c"} err="failed to get container status \"f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c\": rpc error: code = NotFound desc = could not find container \"f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c\": container 
with ID starting with f07c402abe6ed4d161741a03fb3350043138d5c0e07d2de9032316842ba3fd2c not found: ID does not exist" Dec 03 19:13:30 crc kubenswrapper[4731]: I1203 19:13:30.886270 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-77f589d474-ffcg5"] Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.076269 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-748fc65857-2r69r" Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.154575 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84487d7f96-f2jmd"] Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.154844 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84487d7f96-f2jmd" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerName="neutron-api" containerID="cri-o://1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319" gracePeriod=30 Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.155014 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84487d7f96-f2jmd" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerName="neutron-httpd" containerID="cri-o://26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44" gracePeriod=30 Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.857187 4731 generic.go:334] "Generic (PLEG): container finished" podID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerID="f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e" exitCode=0 Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.859622 4731 generic.go:334] "Generic (PLEG): container finished" podID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerID="26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44" exitCode=0 Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.868303 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" path="/var/lib/kubelet/pods/6002841f-20eb-4b1c-b3bd-fb237cf3833a/volumes" Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.869052 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c49ccc8-b3e2-4065-97e6-b789aac2fa69","Type":"ContainerDied","Data":"f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e"} Dec 03 19:13:31 crc kubenswrapper[4731]: I1203 19:13:31.869087 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84487d7f96-f2jmd" event={"ID":"f7f0840d-a586-488a-badd-4439c97a8f1d","Type":"ContainerDied","Data":"26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44"} Dec 03 19:13:32 crc kubenswrapper[4731]: I1203 19:13:32.865168 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 19:13:33 crc kubenswrapper[4731]: I1203 19:13:33.493005 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:13:34 crc kubenswrapper[4731]: I1203 19:13:34.003136 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7578458cb-br8st" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.602322 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-787778d6bb-5glsd" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.754827 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.816539 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.920148 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data-custom\") pod \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.924225 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-combined-ca-bundle\") pod \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.924373 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhz8w\" (UniqueName: \"kubernetes.io/projected/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-kube-api-access-vhz8w\") pod \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.924455 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-scripts\") pod \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.924507 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-etc-machine-id\") pod \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.924576 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data\") pod \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\" (UID: \"1c49ccc8-b3e2-4065-97e6-b789aac2fa69\") " Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.924646 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1c49ccc8-b3e2-4065-97e6-b789aac2fa69" (UID: "1c49ccc8-b3e2-4065-97e6-b789aac2fa69"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.925919 4731 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.927597 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c49ccc8-b3e2-4065-97e6-b789aac2fa69" (UID: "1c49ccc8-b3e2-4065-97e6-b789aac2fa69"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.928774 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-kube-api-access-vhz8w" (OuterVolumeSpecName: "kube-api-access-vhz8w") pod "1c49ccc8-b3e2-4065-97e6-b789aac2fa69" (UID: "1c49ccc8-b3e2-4065-97e6-b789aac2fa69"). InnerVolumeSpecName "kube-api-access-vhz8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.950699 4731 generic.go:334] "Generic (PLEG): container finished" podID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerID="575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1" exitCode=0 Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.950757 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c49ccc8-b3e2-4065-97e6-b789aac2fa69","Type":"ContainerDied","Data":"575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1"} Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.950801 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c49ccc8-b3e2-4065-97e6-b789aac2fa69","Type":"ContainerDied","Data":"60b41764c58b38aee9197d61ae2c3d496a2ed9f7c38c5b0ed033336d1cfb20e8"} Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.950824 4731 scope.go:117] "RemoveContainer" containerID="f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.950986 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.954390 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-scripts" (OuterVolumeSpecName: "scripts") pod "1c49ccc8-b3e2-4065-97e6-b789aac2fa69" (UID: "1c49ccc8-b3e2-4065-97e6-b789aac2fa69"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:35 crc kubenswrapper[4731]: I1203 19:13:35.994705 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c49ccc8-b3e2-4065-97e6-b789aac2fa69" (UID: "1c49ccc8-b3e2-4065-97e6-b789aac2fa69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.027769 4731 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.027809 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.027823 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhz8w\" (UniqueName: \"kubernetes.io/projected/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-kube-api-access-vhz8w\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.027838 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.062720 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data" (OuterVolumeSpecName: "config-data") pod "1c49ccc8-b3e2-4065-97e6-b789aac2fa69" (UID: "1c49ccc8-b3e2-4065-97e6-b789aac2fa69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.130148 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c49ccc8-b3e2-4065-97e6-b789aac2fa69-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.204579 4731 scope.go:117] "RemoveContainer" containerID="575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.238005 4731 scope.go:117] "RemoveContainer" containerID="f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e" Dec 03 19:13:36 crc kubenswrapper[4731]: E1203 19:13:36.239051 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e\": container with ID starting with f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e not found: ID does not exist" containerID="f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.239088 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e"} err="failed to get container status \"f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e\": rpc error: code = NotFound desc = could not find container \"f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e\": container with ID starting with f43d4517d920361a99d042586071c881feb8236249c852459758c86f728add9e not found: ID does not exist" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.239114 4731 scope.go:117] "RemoveContainer" containerID="575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1" Dec 03 19:13:36 crc kubenswrapper[4731]: E1203 19:13:36.239630 4731 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1\": container with ID starting with 575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1 not found: ID does not exist" containerID="575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.239657 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1"} err="failed to get container status \"575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1\": rpc error: code = NotFound desc = could not find container \"575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1\": container with ID starting with 575b7d1a585d58663f75917ac48e1ba5965f789daa93e96cadfaf6e00e52d5e1 not found: ID does not exist" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.248902 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7578458cb-br8st" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.344326 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.379052 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.387601 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:36 crc kubenswrapper[4731]: E1203 19:13:36.388160 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerName="probe" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.388185 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerName="probe" Dec 03 
19:13:36 crc kubenswrapper[4731]: E1203 19:13:36.388201 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.388209 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api" Dec 03 19:13:36 crc kubenswrapper[4731]: E1203 19:13:36.388266 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api-log" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.388276 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api-log" Dec 03 19:13:36 crc kubenswrapper[4731]: E1203 19:13:36.388304 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerName="cinder-scheduler" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.388313 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerName="cinder-scheduler" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.388547 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerName="cinder-scheduler" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.388573 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.388587 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" containerName="probe" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.388963 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002841f-20eb-4b1c-b3bd-fb237cf3833a" containerName="barbican-api-log" Dec 03 19:13:36 
crc kubenswrapper[4731]: I1203 19:13:36.391963 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.400684 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.407943 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-687f68f6b4-jvzgv"] Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.408366 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-687f68f6b4-jvzgv" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon-log" containerID="cri-o://cbf678eabf96eed394c8395472bdf57b49dfe6cd318626870fa3106aed7d89b5" gracePeriod=30 Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.408509 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-687f68f6b4-jvzgv" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" containerID="cri-o://b824719360d5c598d6d63177940c2fff30207b4b666481a901664d9b35d6dd38" gracePeriod=30 Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.434501 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.450875 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-scripts\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.451167 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hdrc\" (UniqueName: 
\"kubernetes.io/projected/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-kube-api-access-2hdrc\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.453421 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.453550 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.453607 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.453669 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-config-data\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.555579 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hdrc\" (UniqueName: 
\"kubernetes.io/projected/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-kube-api-access-2hdrc\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.555917 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.558028 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.558071 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.558120 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-config-data\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.558188 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-scripts\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " 
pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.559163 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.561870 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.562964 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-config-data\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.563810 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.571715 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-scripts\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.573787 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hdrc\" (UniqueName: 
\"kubernetes.io/projected/d8d6f074-0653-495e-8e09-b0cbc71e7e0a-kube-api-access-2hdrc\") pod \"cinder-scheduler-0\" (UID: \"d8d6f074-0653-495e-8e09-b0cbc71e7e0a\") " pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.733188 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.832160 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.866012 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-ovndb-tls-certs\") pod \"f7f0840d-a586-488a-badd-4439c97a8f1d\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.866197 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsqhv\" (UniqueName: \"kubernetes.io/projected/f7f0840d-a586-488a-badd-4439c97a8f1d-kube-api-access-gsqhv\") pod \"f7f0840d-a586-488a-badd-4439c97a8f1d\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.866301 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-combined-ca-bundle\") pod \"f7f0840d-a586-488a-badd-4439c97a8f1d\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.866356 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-config\") pod \"f7f0840d-a586-488a-badd-4439c97a8f1d\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " Dec 03 19:13:36 crc 
kubenswrapper[4731]: I1203 19:13:36.866413 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-httpd-config\") pod \"f7f0840d-a586-488a-badd-4439c97a8f1d\" (UID: \"f7f0840d-a586-488a-badd-4439c97a8f1d\") " Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.886461 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f0840d-a586-488a-badd-4439c97a8f1d-kube-api-access-gsqhv" (OuterVolumeSpecName: "kube-api-access-gsqhv") pod "f7f0840d-a586-488a-badd-4439c97a8f1d" (UID: "f7f0840d-a586-488a-badd-4439c97a8f1d"). InnerVolumeSpecName "kube-api-access-gsqhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.891460 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f7f0840d-a586-488a-badd-4439c97a8f1d" (UID: "f7f0840d-a586-488a-badd-4439c97a8f1d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.955439 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-config" (OuterVolumeSpecName: "config") pod "f7f0840d-a586-488a-badd-4439c97a8f1d" (UID: "f7f0840d-a586-488a-badd-4439c97a8f1d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.968611 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.968647 4731 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.968679 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsqhv\" (UniqueName: \"kubernetes.io/projected/f7f0840d-a586-488a-badd-4439c97a8f1d-kube-api-access-gsqhv\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.973135 4731 generic.go:334] "Generic (PLEG): container finished" podID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerID="1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319" exitCode=0 Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.973379 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84487d7f96-f2jmd" event={"ID":"f7f0840d-a586-488a-badd-4439c97a8f1d","Type":"ContainerDied","Data":"1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319"} Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.973474 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84487d7f96-f2jmd" event={"ID":"f7f0840d-a586-488a-badd-4439c97a8f1d","Type":"ContainerDied","Data":"eaeb841a4e38e245deb30c52e6f7317e04c0043137c1be520774a226d25dbde6"} Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.973553 4731 scope.go:117] "RemoveContainer" containerID="26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44" Dec 03 19:13:36 crc kubenswrapper[4731]: I1203 19:13:36.973817 4731 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/neutron-84487d7f96-f2jmd" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.005017 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7f0840d-a586-488a-badd-4439c97a8f1d" (UID: "f7f0840d-a586-488a-badd-4439c97a8f1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.015452 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f7f0840d-a586-488a-badd-4439c97a8f1d" (UID: "f7f0840d-a586-488a-badd-4439c97a8f1d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.070508 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.070550 4731 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7f0840d-a586-488a-badd-4439c97a8f1d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.102198 4731 scope.go:117] "RemoveContainer" containerID="1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.135150 4731 scope.go:117] "RemoveContainer" containerID="26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44" Dec 03 19:13:37 crc kubenswrapper[4731]: E1203 19:13:37.135822 4731 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44\": container with ID starting with 26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44 not found: ID does not exist" containerID="26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.135968 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44"} err="failed to get container status \"26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44\": rpc error: code = NotFound desc = could not find container \"26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44\": container with ID starting with 26b82cd12c0c7e6b3fa38e0280ea9c646ae5247932d13646708a926c4aabdf44 not found: ID does not exist" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.136067 4731 scope.go:117] "RemoveContainer" containerID="1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319" Dec 03 19:13:37 crc kubenswrapper[4731]: E1203 19:13:37.136639 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319\": container with ID starting with 1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319 not found: ID does not exist" containerID="1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.136768 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319"} err="failed to get container status \"1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319\": rpc error: code = NotFound desc = could not find container 
\"1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319\": container with ID starting with 1f1c703dd50c19f3a0b49d557be42a616c923bdd2d41c8ba4fa48306ae9da319 not found: ID does not exist" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.346863 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84487d7f96-f2jmd"] Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.355803 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84487d7f96-f2jmd"] Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.364843 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.869591 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c49ccc8-b3e2-4065-97e6-b789aac2fa69" path="/var/lib/kubelet/pods/1c49ccc8-b3e2-4065-97e6-b789aac2fa69/volumes" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.870918 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" path="/var/lib/kubelet/pods/f7f0840d-a586-488a-badd-4439c97a8f1d/volumes" Dec 03 19:13:37 crc kubenswrapper[4731]: I1203 19:13:37.986749 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d8d6f074-0653-495e-8e09-b0cbc71e7e0a","Type":"ContainerStarted","Data":"45932e8f26108af90d1f6752bfa18139079f29ae3763e66ed99912bfce3ca7be"} Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.225418 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 19:13:38 crc kubenswrapper[4731]: E1203 19:13:38.226317 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerName="neutron-api" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.226331 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" 
containerName="neutron-api" Dec 03 19:13:38 crc kubenswrapper[4731]: E1203 19:13:38.226375 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerName="neutron-httpd" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.226382 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerName="neutron-httpd" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.226562 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerName="neutron-httpd" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.226589 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f0840d-a586-488a-badd-4439c97a8f1d" containerName="neutron-api" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.227226 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.231781 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.231842 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.232140 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-mvngm" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.245981 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.319556 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6e513fe3-52a0-403d-a6d5-b76e905e55e0-openstack-config\") pod \"openstackclient\" (UID: 
\"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.319672 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e513fe3-52a0-403d-a6d5-b76e905e55e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.319759 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6e513fe3-52a0-403d-a6d5-b76e905e55e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.319791 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrdm5\" (UniqueName: \"kubernetes.io/projected/6e513fe3-52a0-403d-a6d5-b76e905e55e0-kube-api-access-jrdm5\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.421914 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e513fe3-52a0-403d-a6d5-b76e905e55e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.422012 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6e513fe3-52a0-403d-a6d5-b76e905e55e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " 
pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.422050 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrdm5\" (UniqueName: \"kubernetes.io/projected/6e513fe3-52a0-403d-a6d5-b76e905e55e0-kube-api-access-jrdm5\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.422094 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6e513fe3-52a0-403d-a6d5-b76e905e55e0-openstack-config\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.423113 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6e513fe3-52a0-403d-a6d5-b76e905e55e0-openstack-config\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.427489 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e513fe3-52a0-403d-a6d5-b76e905e55e0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.429070 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6e513fe3-52a0-403d-a6d5-b76e905e55e0-openstack-config-secret\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.441784 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jrdm5\" (UniqueName: \"kubernetes.io/projected/6e513fe3-52a0-403d-a6d5-b76e905e55e0-kube-api-access-jrdm5\") pod \"openstackclient\" (UID: \"6e513fe3-52a0-403d-a6d5-b76e905e55e0\") " pod="openstack/openstackclient" Dec 03 19:13:38 crc kubenswrapper[4731]: I1203 19:13:38.552078 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 19:13:39 crc kubenswrapper[4731]: I1203 19:13:39.000221 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d8d6f074-0653-495e-8e09-b0cbc71e7e0a","Type":"ContainerStarted","Data":"251169d45c95889064c90ed9399e341c597b4347d180b9b5fd930fd48f25672c"} Dec 03 19:13:39 crc kubenswrapper[4731]: I1203 19:13:39.000907 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d8d6f074-0653-495e-8e09-b0cbc71e7e0a","Type":"ContainerStarted","Data":"4c0f2da1e39c14b0090443600c23924cd5ad9cb412d28e0d705fa2dc9655aa47"} Dec 03 19:13:39 crc kubenswrapper[4731]: I1203 19:13:39.117653 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.117632019 podStartE2EDuration="3.117632019s" podCreationTimestamp="2025-12-03 19:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:39.030512029 +0000 UTC m=+1139.629106493" watchObservedRunningTime="2025-12-03 19:13:39.117632019 +0000 UTC m=+1139.716226473" Dec 03 19:13:39 crc kubenswrapper[4731]: I1203 19:13:39.122036 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 19:13:39 crc kubenswrapper[4731]: I1203 19:13:39.802432 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:39 crc kubenswrapper[4731]: I1203 19:13:39.807688 4731 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f7d595d8d-wt4nd" Dec 03 19:13:40 crc kubenswrapper[4731]: I1203 19:13:40.019733 4731 generic.go:334] "Generic (PLEG): container finished" podID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerID="b824719360d5c598d6d63177940c2fff30207b4b666481a901664d9b35d6dd38" exitCode=0 Dec 03 19:13:40 crc kubenswrapper[4731]: I1203 19:13:40.019810 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687f68f6b4-jvzgv" event={"ID":"f69c7907-c04f-4b84-9e31-59fca146a62d","Type":"ContainerDied","Data":"b824719360d5c598d6d63177940c2fff30207b4b666481a901664d9b35d6dd38"} Dec 03 19:13:40 crc kubenswrapper[4731]: I1203 19:13:40.022755 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6e513fe3-52a0-403d-a6d5-b76e905e55e0","Type":"ContainerStarted","Data":"493024c9661445c37998b47aaf92a430d05041bc6103e03b183d4ab156bacc89"} Dec 03 19:13:41 crc kubenswrapper[4731]: I1203 19:13:41.079747 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-687f68f6b4-jvzgv" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.136:8443: connect: connection refused" Dec 03 19:13:41 crc kubenswrapper[4731]: I1203 19:13:41.734637 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.556932 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.557732 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="proxy-httpd" 
containerID="cri-o://f38304e14069560b753b05dd7ebc0cc3206ee0bc39b4c709f11328ce4cdf391d" gracePeriod=30 Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.557835 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="ceilometer-notification-agent" containerID="cri-o://f1644da2565236579ad8b309e5c788ba15bf623963231696b5babb64189bc778" gracePeriod=30 Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.557902 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="ceilometer-central-agent" containerID="cri-o://1c26e9d9bf05e4dd0f6fa3c7afacc5f4639ea779bad7bc070918a0682fdddf2f" gracePeriod=30 Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.558232 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="sg-core" containerID="cri-o://48aa671f3f1fd2f4eb44075a1ebe40ed2ffd07a5d8cf3c2a40dd9e1991172664" gracePeriod=30 Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.588399 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.600478 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5cdc55748f-txbzr"] Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.602183 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.605625 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.605884 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.606134 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.615420 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5cdc55748f-txbzr"] Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.691034 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzc8r\" (UniqueName: \"kubernetes.io/projected/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-kube-api-access-nzc8r\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.691513 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-log-httpd\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.691559 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-public-tls-certs\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc 
kubenswrapper[4731]: I1203 19:13:43.691587 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-config-data\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.691869 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-internal-tls-certs\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.692169 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-etc-swift\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.692203 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-combined-ca-bundle\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.692416 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-run-httpd\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " 
pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.801318 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-public-tls-certs\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.801434 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-config-data\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.801508 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-internal-tls-certs\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.801578 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-etc-swift\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.801596 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-combined-ca-bundle\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 
crc kubenswrapper[4731]: I1203 19:13:43.801639 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-run-httpd\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.801712 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzc8r\" (UniqueName: \"kubernetes.io/projected/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-kube-api-access-nzc8r\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.801748 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-log-httpd\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.802392 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-log-httpd\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.803574 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-run-httpd\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.812506 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-config-data\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.813317 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-internal-tls-certs\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.819253 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-combined-ca-bundle\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.822929 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-etc-swift\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.822981 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzc8r\" (UniqueName: \"kubernetes.io/projected/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-kube-api-access-nzc8r\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.824068 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1326b67b-b6eb-476f-957b-3f6f2ba94ec5-public-tls-certs\") pod \"swift-proxy-5cdc55748f-txbzr\" (UID: \"1326b67b-b6eb-476f-957b-3f6f2ba94ec5\") " pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:43 crc kubenswrapper[4731]: I1203 19:13:43.939132 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:44 crc kubenswrapper[4731]: I1203 19:13:44.093129 4731 generic.go:334] "Generic (PLEG): container finished" podID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerID="f38304e14069560b753b05dd7ebc0cc3206ee0bc39b4c709f11328ce4cdf391d" exitCode=0 Dec 03 19:13:44 crc kubenswrapper[4731]: I1203 19:13:44.093168 4731 generic.go:334] "Generic (PLEG): container finished" podID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerID="48aa671f3f1fd2f4eb44075a1ebe40ed2ffd07a5d8cf3c2a40dd9e1991172664" exitCode=2 Dec 03 19:13:44 crc kubenswrapper[4731]: I1203 19:13:44.093196 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerDied","Data":"f38304e14069560b753b05dd7ebc0cc3206ee0bc39b4c709f11328ce4cdf391d"} Dec 03 19:13:44 crc kubenswrapper[4731]: I1203 19:13:44.093234 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerDied","Data":"48aa671f3f1fd2f4eb44075a1ebe40ed2ffd07a5d8cf3c2a40dd9e1991172664"} Dec 03 19:13:45 crc kubenswrapper[4731]: I1203 19:13:45.116964 4731 generic.go:334] "Generic (PLEG): container finished" podID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerID="1c26e9d9bf05e4dd0f6fa3c7afacc5f4639ea779bad7bc070918a0682fdddf2f" exitCode=0 Dec 03 19:13:45 crc kubenswrapper[4731]: I1203 19:13:45.117187 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerDied","Data":"1c26e9d9bf05e4dd0f6fa3c7afacc5f4639ea779bad7bc070918a0682fdddf2f"} Dec 03 19:13:46 crc kubenswrapper[4731]: I1203 19:13:46.149476 4731 generic.go:334] "Generic (PLEG): container finished" podID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerID="f1644da2565236579ad8b309e5c788ba15bf623963231696b5babb64189bc778" exitCode=0 Dec 03 19:13:46 crc kubenswrapper[4731]: I1203 19:13:46.149554 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerDied","Data":"f1644da2565236579ad8b309e5c788ba15bf623963231696b5babb64189bc778"} Dec 03 19:13:47 crc kubenswrapper[4731]: I1203 19:13:47.013404 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 19:13:47 crc kubenswrapper[4731]: I1203 19:13:47.911171 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:13:47 crc kubenswrapper[4731]: I1203 19:13:47.912748 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8a549a4d-e989-42bf-8e74-556e6feb9507" containerName="kube-state-metrics" containerID="cri-o://9f7592975c03bfb2b554ae6cd8fe2dcd239a395e0d58e02e9a8a319b14e61f0b" gracePeriod=30 Dec 03 19:13:48 crc kubenswrapper[4731]: I1203 19:13:48.216954 4731 generic.go:334] "Generic (PLEG): container finished" podID="8a549a4d-e989-42bf-8e74-556e6feb9507" containerID="9f7592975c03bfb2b554ae6cd8fe2dcd239a395e0d58e02e9a8a319b14e61f0b" exitCode=2 Dec 03 19:13:48 crc kubenswrapper[4731]: I1203 19:13:48.217022 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a549a4d-e989-42bf-8e74-556e6feb9507","Type":"ContainerDied","Data":"9f7592975c03bfb2b554ae6cd8fe2dcd239a395e0d58e02e9a8a319b14e61f0b"} Dec 03 19:13:48 crc kubenswrapper[4731]: I1203 
19:13:48.989562 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="8a549a4d-e989-42bf-8e74-556e6feb9507" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": dial tcp 10.217.0.100:8081: connect: connection refused" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.172490 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.243312 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a549a4d-e989-42bf-8e74-556e6feb9507","Type":"ContainerDied","Data":"9e857202732871b12e49db354f409e4a09f31780c6d670bfb5969ca88f85d937"} Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.243418 4731 scope.go:117] "RemoveContainer" containerID="9f7592975c03bfb2b554ae6cd8fe2dcd239a395e0d58e02e9a8a319b14e61f0b" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.243658 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.251139 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6e513fe3-52a0-403d-a6d5-b76e905e55e0","Type":"ContainerStarted","Data":"549e49e58baa7bfbce28edcb9ddeaf2dea543eae190846fb9b8f93de665f9ed0"} Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.270712 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwtfm\" (UniqueName: \"kubernetes.io/projected/8a549a4d-e989-42bf-8e74-556e6feb9507-kube-api-access-rwtfm\") pod \"8a549a4d-e989-42bf-8e74-556e6feb9507\" (UID: \"8a549a4d-e989-42bf-8e74-556e6feb9507\") " Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.276225 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a549a4d-e989-42bf-8e74-556e6feb9507-kube-api-access-rwtfm" (OuterVolumeSpecName: "kube-api-access-rwtfm") pod "8a549a4d-e989-42bf-8e74-556e6feb9507" (UID: "8a549a4d-e989-42bf-8e74-556e6feb9507"). InnerVolumeSpecName "kube-api-access-rwtfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.292706 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.506365661 podStartE2EDuration="12.292682422s" podCreationTimestamp="2025-12-03 19:13:38 +0000 UTC" firstStartedPulling="2025-12-03 19:13:39.127473908 +0000 UTC m=+1139.726068372" lastFinishedPulling="2025-12-03 19:13:49.913790669 +0000 UTC m=+1150.512385133" observedRunningTime="2025-12-03 19:13:50.287226621 +0000 UTC m=+1150.885821085" watchObservedRunningTime="2025-12-03 19:13:50.292682422 +0000 UTC m=+1150.891276886" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.376229 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwtfm\" (UniqueName: \"kubernetes.io/projected/8a549a4d-e989-42bf-8e74-556e6feb9507-kube-api-access-rwtfm\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.391590 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.476938 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-combined-ca-bundle\") pod \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.477124 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-scripts\") pod \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.477167 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-log-httpd\") pod \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.477198 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p44sd\" (UniqueName: \"kubernetes.io/projected/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-kube-api-access-p44sd\") pod \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.477247 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-run-httpd\") pod \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.477383 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-sg-core-conf-yaml\") pod \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.477420 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-config-data\") pod \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\" (UID: \"05af2ccb-603d-40f9-a837-1c9c19d3f1cf\") " Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.477943 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "05af2ccb-603d-40f9-a837-1c9c19d3f1cf" (UID: "05af2ccb-603d-40f9-a837-1c9c19d3f1cf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.478333 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "05af2ccb-603d-40f9-a837-1c9c19d3f1cf" (UID: "05af2ccb-603d-40f9-a837-1c9c19d3f1cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.487955 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-kube-api-access-p44sd" (OuterVolumeSpecName: "kube-api-access-p44sd") pod "05af2ccb-603d-40f9-a837-1c9c19d3f1cf" (UID: "05af2ccb-603d-40f9-a837-1c9c19d3f1cf"). InnerVolumeSpecName "kube-api-access-p44sd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.500561 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-scripts" (OuterVolumeSpecName: "scripts") pod "05af2ccb-603d-40f9-a837-1c9c19d3f1cf" (UID: "05af2ccb-603d-40f9-a837-1c9c19d3f1cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.575840 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "05af2ccb-603d-40f9-a837-1c9c19d3f1cf" (UID: "05af2ccb-603d-40f9-a837-1c9c19d3f1cf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.580718 4731 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.580757 4731 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.580769 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.580797 4731 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:50 crc kubenswrapper[4731]: 
I1203 19:13:50.580808 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p44sd\" (UniqueName: \"kubernetes.io/projected/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-kube-api-access-p44sd\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:50 crc kubenswrapper[4731]: W1203 19:13:50.590382 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1326b67b_b6eb_476f_957b_3f6f2ba94ec5.slice/crio-e4fe093f55d5cdc508342bbc9990f30f8b9ea5718d44c1738269d136c2754954 WatchSource:0}: Error finding container e4fe093f55d5cdc508342bbc9990f30f8b9ea5718d44c1738269d136c2754954: Status 404 returned error can't find the container with id e4fe093f55d5cdc508342bbc9990f30f8b9ea5718d44c1738269d136c2754954 Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.602179 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5cdc55748f-txbzr"] Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.619881 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.640417 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05af2ccb-603d-40f9-a837-1c9c19d3f1cf" (UID: "05af2ccb-603d-40f9-a837-1c9c19d3f1cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.647556 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.657021 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:13:50 crc kubenswrapper[4731]: E1203 19:13:50.657746 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="proxy-httpd" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.657772 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="proxy-httpd" Dec 03 19:13:50 crc kubenswrapper[4731]: E1203 19:13:50.657821 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="ceilometer-notification-agent" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.657831 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="ceilometer-notification-agent" Dec 03 19:13:50 crc kubenswrapper[4731]: E1203 19:13:50.657848 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a549a4d-e989-42bf-8e74-556e6feb9507" containerName="kube-state-metrics" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.657857 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a549a4d-e989-42bf-8e74-556e6feb9507" containerName="kube-state-metrics" Dec 03 19:13:50 crc kubenswrapper[4731]: E1203 19:13:50.657872 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="sg-core" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.657881 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="sg-core" Dec 03 19:13:50 crc 
kubenswrapper[4731]: E1203 19:13:50.657923 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="ceilometer-central-agent" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.657933 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="ceilometer-central-agent" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.658271 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="sg-core" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.658294 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a549a4d-e989-42bf-8e74-556e6feb9507" containerName="kube-state-metrics" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.658322 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="ceilometer-notification-agent" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.658334 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="proxy-httpd" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.658352 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" containerName="ceilometer-central-agent" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.659310 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.659976 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-config-data" (OuterVolumeSpecName: "config-data") pod "05af2ccb-603d-40f9-a837-1c9c19d3f1cf" (UID: "05af2ccb-603d-40f9-a837-1c9c19d3f1cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.665666 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.666071 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.681565 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.683762 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.683797 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05af2ccb-603d-40f9-a837-1c9c19d3f1cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.785458 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.785516 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.785542 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mssgq\" (UniqueName: \"kubernetes.io/projected/4b88a816-d7c7-4608-b062-f8b9432af359-kube-api-access-mssgq\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.785841 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.888328 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.888408 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.888438 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mssgq\" (UniqueName: \"kubernetes.io/projected/4b88a816-d7c7-4608-b062-f8b9432af359-kube-api-access-mssgq\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.888485 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.896076 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.896522 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.901764 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4b88a816-d7c7-4608-b062-f8b9432af359-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.907709 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mssgq\" (UniqueName: \"kubernetes.io/projected/4b88a816-d7c7-4608-b062-f8b9432af359-kube-api-access-mssgq\") pod \"kube-state-metrics-0\" (UID: \"4b88a816-d7c7-4608-b062-f8b9432af359\") " pod="openstack/kube-state-metrics-0" Dec 03 19:13:50 crc kubenswrapper[4731]: I1203 19:13:50.989545 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.079614 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-687f68f6b4-jvzgv" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.136:8443: connect: connection refused" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.265939 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cdc55748f-txbzr" event={"ID":"1326b67b-b6eb-476f-957b-3f6f2ba94ec5","Type":"ContainerStarted","Data":"febbaf694d31891e5cbb47182170b9aae494ab64e878c01e3bb08554e9568ada"} Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.266387 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.266402 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cdc55748f-txbzr" event={"ID":"1326b67b-b6eb-476f-957b-3f6f2ba94ec5","Type":"ContainerStarted","Data":"6e900566ca5d24119af134a2f992c46a8e4aefc6002a75a1ce599378a904e24a"} Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.266420 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.266430 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cdc55748f-txbzr" event={"ID":"1326b67b-b6eb-476f-957b-3f6f2ba94ec5","Type":"ContainerStarted","Data":"e4fe093f55d5cdc508342bbc9990f30f8b9ea5718d44c1738269d136c2754954"} Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.279784 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.280587 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05af2ccb-603d-40f9-a837-1c9c19d3f1cf","Type":"ContainerDied","Data":"886c63b839f5fd2670193fc125f78226519395c162ce4a2f7f5fa8daea0eb94a"} Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.280635 4731 scope.go:117] "RemoveContainer" containerID="f38304e14069560b753b05dd7ebc0cc3206ee0bc39b4c709f11328ce4cdf391d" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.316801 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5cdc55748f-txbzr" podStartSLOduration=8.316779321 podStartE2EDuration="8.316779321s" podCreationTimestamp="2025-12-03 19:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:51.312509096 +0000 UTC m=+1151.911103560" watchObservedRunningTime="2025-12-03 19:13:51.316779321 +0000 UTC m=+1151.915373785" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.332134 4731 scope.go:117] "RemoveContainer" containerID="48aa671f3f1fd2f4eb44075a1ebe40ed2ffd07a5d8cf3c2a40dd9e1991172664" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.337116 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.351668 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.365959 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.368976 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.370343 4731 scope.go:117] "RemoveContainer" containerID="f1644da2565236579ad8b309e5c788ba15bf623963231696b5babb64189bc778" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.373166 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.374332 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.374495 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.375792 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.405170 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-log-httpd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.405720 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-scripts\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.405758 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-config-data\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc 
kubenswrapper[4731]: I1203 19:13:51.405887 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.406013 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.406161 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-run-httpd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.406812 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbnwd\" (UniqueName: \"kubernetes.io/projected/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-kube-api-access-fbnwd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.407115 4731 scope.go:117] "RemoveContainer" containerID="1c26e9d9bf05e4dd0f6fa3c7afacc5f4639ea779bad7bc070918a0682fdddf2f" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.408077 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: W1203 19:13:51.495694 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b88a816_d7c7_4608_b062_f8b9432af359.slice/crio-70625a430d1b4b5456ce5e73b3566734bce8caf0f186a2f00dc98ad00d01416e WatchSource:0}: Error finding container 70625a430d1b4b5456ce5e73b3566734bce8caf0f186a2f00dc98ad00d01416e: Status 404 returned error can't find the container with id 70625a430d1b4b5456ce5e73b3566734bce8caf0f186a2f00dc98ad00d01416e Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.496107 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.511291 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-run-httpd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.511539 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbnwd\" (UniqueName: \"kubernetes.io/projected/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-kube-api-access-fbnwd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.511725 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.511905 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-run-httpd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.511914 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-log-httpd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.512042 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-scripts\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.512082 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-config-data\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.512185 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.512236 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc 
kubenswrapper[4731]: I1203 19:13:51.513339 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-log-httpd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.519069 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-scripts\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.520663 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.521014 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-config-data\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.523408 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.526322 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.529273 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbnwd\" (UniqueName: \"kubernetes.io/projected/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-kube-api-access-fbnwd\") pod \"ceilometer-0\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.699811 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.804237 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-pcjrh"] Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.808632 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.835006 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pcjrh"] Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.889616 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05af2ccb-603d-40f9-a837-1c9c19d3f1cf" path="/var/lib/kubelet/pods/05af2ccb-603d-40f9-a837-1c9c19d3f1cf/volumes" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.891170 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a549a4d-e989-42bf-8e74-556e6feb9507" path="/var/lib/kubelet/pods/8a549a4d-e989-42bf-8e74-556e6feb9507/volumes" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.923674 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl6xc\" (UniqueName: \"kubernetes.io/projected/8229af75-4935-4593-b482-7c841a710cd9-kube-api-access-zl6xc\") pod \"nova-api-db-create-pcjrh\" (UID: \"8229af75-4935-4593-b482-7c841a710cd9\") " 
pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:51 crc kubenswrapper[4731]: I1203 19:13:51.923731 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8229af75-4935-4593-b482-7c841a710cd9-operator-scripts\") pod \"nova-api-db-create-pcjrh\" (UID: \"8229af75-4935-4593-b482-7c841a710cd9\") " pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.026344 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl6xc\" (UniqueName: \"kubernetes.io/projected/8229af75-4935-4593-b482-7c841a710cd9-kube-api-access-zl6xc\") pod \"nova-api-db-create-pcjrh\" (UID: \"8229af75-4935-4593-b482-7c841a710cd9\") " pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.026401 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8229af75-4935-4593-b482-7c841a710cd9-operator-scripts\") pod \"nova-api-db-create-pcjrh\" (UID: \"8229af75-4935-4593-b482-7c841a710cd9\") " pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.027813 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8229af75-4935-4593-b482-7c841a710cd9-operator-scripts\") pod \"nova-api-db-create-pcjrh\" (UID: \"8229af75-4935-4593-b482-7c841a710cd9\") " pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.031150 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-539b-account-create-update-g5ghg"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.032657 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.039493 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.048179 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-xtlvb"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.048835 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl6xc\" (UniqueName: \"kubernetes.io/projected/8229af75-4935-4593-b482-7c841a710cd9-kube-api-access-zl6xc\") pod \"nova-api-db-create-pcjrh\" (UID: \"8229af75-4935-4593-b482-7c841a710cd9\") " pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.049882 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.060724 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-539b-account-create-update-g5ghg"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.102512 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xtlvb"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.153509 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmstd\" (UniqueName: \"kubernetes.io/projected/86009e3d-4e12-4660-8104-b71fade8668f-kube-api-access-gmstd\") pod \"nova-cell0-db-create-xtlvb\" (UID: \"86009e3d-4e12-4660-8104-b71fade8668f\") " pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.158357 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.193003 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86009e3d-4e12-4660-8104-b71fade8668f-operator-scripts\") pod \"nova-cell0-db-create-xtlvb\" (UID: \"86009e3d-4e12-4660-8104-b71fade8668f\") " pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.193156 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-operator-scripts\") pod \"nova-api-539b-account-create-update-g5ghg\" (UID: \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\") " pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.193375 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87lrz\" (UniqueName: \"kubernetes.io/projected/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-kube-api-access-87lrz\") pod \"nova-api-539b-account-create-update-g5ghg\" (UID: \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\") " pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.198097 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-s9jrx"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.237022 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.253429 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s9jrx"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.295746 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c53f-account-create-update-vbgt5"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.297531 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.307943 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.309949 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86009e3d-4e12-4660-8104-b71fade8668f-operator-scripts\") pod \"nova-cell0-db-create-xtlvb\" (UID: \"86009e3d-4e12-4660-8104-b71fade8668f\") " pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.310014 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-operator-scripts\") pod \"nova-api-539b-account-create-update-g5ghg\" (UID: \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\") " pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.310087 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87lrz\" (UniqueName: \"kubernetes.io/projected/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-kube-api-access-87lrz\") pod \"nova-api-539b-account-create-update-g5ghg\" (UID: \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\") " 
pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.310141 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934159f-6d4b-4243-a5c3-0092fe5f58c4-operator-scripts\") pod \"nova-cell1-db-create-s9jrx\" (UID: \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\") " pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.310181 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glrdp\" (UniqueName: \"kubernetes.io/projected/6934159f-6d4b-4243-a5c3-0092fe5f58c4-kube-api-access-glrdp\") pod \"nova-cell1-db-create-s9jrx\" (UID: \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\") " pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.310202 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmstd\" (UniqueName: \"kubernetes.io/projected/86009e3d-4e12-4660-8104-b71fade8668f-kube-api-access-gmstd\") pod \"nova-cell0-db-create-xtlvb\" (UID: \"86009e3d-4e12-4660-8104-b71fade8668f\") " pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.311324 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86009e3d-4e12-4660-8104-b71fade8668f-operator-scripts\") pod \"nova-cell0-db-create-xtlvb\" (UID: \"86009e3d-4e12-4660-8104-b71fade8668f\") " pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.317022 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4b88a816-d7c7-4608-b062-f8b9432af359","Type":"ContainerStarted","Data":"6f506563ed2162daf54e1bc96a2c9e1373fc21e182836b11fca733221d0764d3"} Dec 03 19:13:52 crc 
kubenswrapper[4731]: I1203 19:13:52.318994 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4b88a816-d7c7-4608-b062-f8b9432af359","Type":"ContainerStarted","Data":"70625a430d1b4b5456ce5e73b3566734bce8caf0f186a2f00dc98ad00d01416e"} Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.318147 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-operator-scripts\") pod \"nova-api-539b-account-create-update-g5ghg\" (UID: \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\") " pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.342553 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87lrz\" (UniqueName: \"kubernetes.io/projected/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-kube-api-access-87lrz\") pod \"nova-api-539b-account-create-update-g5ghg\" (UID: \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\") " pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.342933 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmstd\" (UniqueName: \"kubernetes.io/projected/86009e3d-4e12-4660-8104-b71fade8668f-kube-api-access-gmstd\") pod \"nova-cell0-db-create-xtlvb\" (UID: \"86009e3d-4e12-4660-8104-b71fade8668f\") " pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.357204 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c53f-account-create-update-vbgt5"] Dec 03 19:13:52 crc kubenswrapper[4731]: W1203 19:13:52.391772 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37efbdfd_77b5_4d8c_90cc_c8a6786f189a.slice/crio-28a1bebeed207950e7facfb3c3e5754e3b3a9d77da109234eef8127d635481e2 
WatchSource:0}: Error finding container 28a1bebeed207950e7facfb3c3e5754e3b3a9d77da109234eef8127d635481e2: Status 404 returned error can't find the container with id 28a1bebeed207950e7facfb3c3e5754e3b3a9d77da109234eef8127d635481e2 Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.394971 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.418145 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934159f-6d4b-4243-a5c3-0092fe5f58c4-operator-scripts\") pod \"nova-cell1-db-create-s9jrx\" (UID: \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\") " pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.418219 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glrdp\" (UniqueName: \"kubernetes.io/projected/6934159f-6d4b-4243-a5c3-0092fe5f58c4-kube-api-access-glrdp\") pod \"nova-cell1-db-create-s9jrx\" (UID: \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\") " pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.418400 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhbks\" (UniqueName: \"kubernetes.io/projected/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-kube-api-access-qhbks\") pod \"nova-cell0-c53f-account-create-update-vbgt5\" (UID: \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\") " pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.418549 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-operator-scripts\") pod \"nova-cell0-c53f-account-create-update-vbgt5\" (UID: \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\") " 
pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.419535 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934159f-6d4b-4243-a5c3-0092fe5f58c4-operator-scripts\") pod \"nova-cell1-db-create-s9jrx\" (UID: \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\") " pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.434110 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-439b-account-create-update-28tkb"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.435511 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.438286 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.455662 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glrdp\" (UniqueName: \"kubernetes.io/projected/6934159f-6d4b-4243-a5c3-0092fe5f58c4-kube-api-access-glrdp\") pod \"nova-cell1-db-create-s9jrx\" (UID: \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\") " pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.466428 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-439b-account-create-update-28tkb"] Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.497021 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.520756 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9x9h\" (UniqueName: \"kubernetes.io/projected/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-kube-api-access-f9x9h\") pod \"nova-cell1-439b-account-create-update-28tkb\" (UID: \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\") " pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.520847 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-operator-scripts\") pod \"nova-cell0-c53f-account-create-update-vbgt5\" (UID: \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\") " pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.520928 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-operator-scripts\") pod \"nova-cell1-439b-account-create-update-28tkb\" (UID: \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\") " pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.520979 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhbks\" (UniqueName: \"kubernetes.io/projected/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-kube-api-access-qhbks\") pod \"nova-cell0-c53f-account-create-update-vbgt5\" (UID: \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\") " pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.521965 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-operator-scripts\") pod \"nova-cell0-c53f-account-create-update-vbgt5\" (UID: \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\") " pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.547057 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhbks\" (UniqueName: \"kubernetes.io/projected/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-kube-api-access-qhbks\") pod \"nova-cell0-c53f-account-create-update-vbgt5\" (UID: \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\") " pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.585036 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.621776 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.622549 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-operator-scripts\") pod \"nova-cell1-439b-account-create-update-28tkb\" (UID: \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\") " pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.622703 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9x9h\" (UniqueName: \"kubernetes.io/projected/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-kube-api-access-f9x9h\") pod \"nova-cell1-439b-account-create-update-28tkb\" (UID: \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\") " pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.623898 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-operator-scripts\") pod \"nova-cell1-439b-account-create-update-28tkb\" (UID: \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\") " pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.642642 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.654937 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9x9h\" (UniqueName: \"kubernetes.io/projected/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-kube-api-access-f9x9h\") pod \"nova-cell1-439b-account-create-update-28tkb\" (UID: \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\") " pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.768486 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:52 crc kubenswrapper[4731]: I1203 19:13:52.976111 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pcjrh"] Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.021162 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.163150 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-539b-account-create-update-g5ghg"] Dec 03 19:13:53 crc kubenswrapper[4731]: W1203 19:13:53.185972 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1aa8978_06f7_4e6c_9e3b_3edbf20878bd.slice/crio-923f380c174c7979dae6f383fde43eea40ef05b3df953ddd1ab8d58a4652c24a WatchSource:0}: Error finding container 923f380c174c7979dae6f383fde43eea40ef05b3df953ddd1ab8d58a4652c24a: Status 404 returned error can't find the container with id 923f380c174c7979dae6f383fde43eea40ef05b3df953ddd1ab8d58a4652c24a Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.345028 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-539b-account-create-update-g5ghg" event={"ID":"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd","Type":"ContainerStarted","Data":"923f380c174c7979dae6f383fde43eea40ef05b3df953ddd1ab8d58a4652c24a"} Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.347775 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pcjrh" event={"ID":"8229af75-4935-4593-b482-7c841a710cd9","Type":"ContainerStarted","Data":"0b54090a7d65d2ea49af1f053c88450ee6f65f08f0d4b78ba3987bf49cbf71fb"} Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.349952 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerStarted","Data":"28a1bebeed207950e7facfb3c3e5754e3b3a9d77da109234eef8127d635481e2"} Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.351101 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.379361 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.998472826 podStartE2EDuration="3.37934093s" podCreationTimestamp="2025-12-03 19:13:50 +0000 UTC" firstStartedPulling="2025-12-03 19:13:51.499706892 +0000 UTC m=+1152.098301356" lastFinishedPulling="2025-12-03 19:13:51.880574996 +0000 UTC m=+1152.479169460" observedRunningTime="2025-12-03 19:13:53.372499936 +0000 UTC m=+1153.971094400" watchObservedRunningTime="2025-12-03 19:13:53.37934093 +0000 UTC m=+1153.977935394" Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.416370 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xtlvb"] Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.623901 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c53f-account-create-update-vbgt5"] Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.640109 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s9jrx"] Dec 03 19:13:53 crc kubenswrapper[4731]: I1203 19:13:53.785511 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-439b-account-create-update-28tkb"] Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.369586 4731 generic.go:334] "Generic (PLEG): container finished" podID="6934159f-6d4b-4243-a5c3-0092fe5f58c4" containerID="efdb1ff6a0fcba15ad7c8ec2da260a07ec9bcbbdab3c0c63643abf3612e90a4f" exitCode=0 Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.370018 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-s9jrx" event={"ID":"6934159f-6d4b-4243-a5c3-0092fe5f58c4","Type":"ContainerDied","Data":"efdb1ff6a0fcba15ad7c8ec2da260a07ec9bcbbdab3c0c63643abf3612e90a4f"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.370058 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9jrx" event={"ID":"6934159f-6d4b-4243-a5c3-0092fe5f58c4","Type":"ContainerStarted","Data":"e478f5a515cd80078799409c348b3e66dd775798a7b544b0747218422b21a14a"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.377552 4731 generic.go:334] "Generic (PLEG): container finished" podID="c1aa8978-06f7-4e6c-9e3b-3edbf20878bd" containerID="727d85169cb5d22ec89098dd70aeedcb5893b3202cfa46965abdb2b302422cc6" exitCode=0 Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.377637 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-539b-account-create-update-g5ghg" event={"ID":"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd","Type":"ContainerDied","Data":"727d85169cb5d22ec89098dd70aeedcb5893b3202cfa46965abdb2b302422cc6"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.379520 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xtlvb" event={"ID":"86009e3d-4e12-4660-8104-b71fade8668f","Type":"ContainerStarted","Data":"5246cba9874413165f2d2d3b2375af0bc5ab26225941a46ef8a97fe67863cb92"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.379547 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xtlvb" event={"ID":"86009e3d-4e12-4660-8104-b71fade8668f","Type":"ContainerStarted","Data":"c1d1d42c7cb5fd620947256c1e873883c8b3e358e2c7b1968f875cb0a2f149b1"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.394561 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" 
event={"ID":"d64c6991-7b2b-4432-8e99-ac9d1989b0ec","Type":"ContainerStarted","Data":"46feab51ee5e267f863d80bcdd090558775a80dc0fe9726b173c4f51b8b4d2d2"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.394647 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" event={"ID":"d64c6991-7b2b-4432-8e99-ac9d1989b0ec","Type":"ContainerStarted","Data":"d6c6ef47c8baa7f963ce983c6b25f67d9bc73761d488229aaa041de2761e193b"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.401848 4731 generic.go:334] "Generic (PLEG): container finished" podID="8229af75-4935-4593-b482-7c841a710cd9" containerID="badfb0e4c93ca53b210be7ff8c1211507828c501206f2e02920f6269450aa1bb" exitCode=0 Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.401960 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pcjrh" event={"ID":"8229af75-4935-4593-b482-7c841a710cd9","Type":"ContainerDied","Data":"badfb0e4c93ca53b210be7ff8c1211507828c501206f2e02920f6269450aa1bb"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.408895 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-439b-account-create-update-28tkb" event={"ID":"ff2992ad-2bee-42b4-8dcc-4584ffa3b336","Type":"ContainerStarted","Data":"a8d912d9e70a0de3d02a17fb58f5268c48a72eb303961127fe5ffc6be5f190fe"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.408924 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-439b-account-create-update-28tkb" event={"ID":"ff2992ad-2bee-42b4-8dcc-4584ffa3b336","Type":"ContainerStarted","Data":"f25d1e5c95d3bf5696d04afd0d764851e4f8c362c5d6b08fe83193adc6a80494"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.412943 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerStarted","Data":"017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23"} Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.577010 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" podStartSLOduration=2.576957096 podStartE2EDuration="2.576957096s" podCreationTimestamp="2025-12-03 19:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:54.4941111 +0000 UTC m=+1155.092705564" watchObservedRunningTime="2025-12-03 19:13:54.576957096 +0000 UTC m=+1155.175551560" Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.588183 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.588531 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="12509218-711b-46a4-a560-493ce03af965" containerName="glance-log" containerID="cri-o://d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d" gracePeriod=30 Dec 03 19:13:54 crc kubenswrapper[4731]: I1203 19:13:54.588726 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="12509218-711b-46a4-a560-493ce03af965" containerName="glance-httpd" containerID="cri-o://2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916" gracePeriod=30 Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.322957 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.403053 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data-custom\") pod \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.403159 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8fbf\" (UniqueName: \"kubernetes.io/projected/2e399280-7b1a-4c02-9ef6-787d93fe41c5-kube-api-access-r8fbf\") pod \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.403466 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-combined-ca-bundle\") pod \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.403503 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e399280-7b1a-4c02-9ef6-787d93fe41c5-logs\") pod \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.403641 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-scripts\") pod \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.403692 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data\") pod \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.403737 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e399280-7b1a-4c02-9ef6-787d93fe41c5-etc-machine-id\") pod \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\" (UID: \"2e399280-7b1a-4c02-9ef6-787d93fe41c5\") " Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.404352 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e399280-7b1a-4c02-9ef6-787d93fe41c5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2e399280-7b1a-4c02-9ef6-787d93fe41c5" (UID: "2e399280-7b1a-4c02-9ef6-787d93fe41c5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.406634 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e399280-7b1a-4c02-9ef6-787d93fe41c5-logs" (OuterVolumeSpecName: "logs") pod "2e399280-7b1a-4c02-9ef6-787d93fe41c5" (UID: "2e399280-7b1a-4c02-9ef6-787d93fe41c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.428061 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-scripts" (OuterVolumeSpecName: "scripts") pod "2e399280-7b1a-4c02-9ef6-787d93fe41c5" (UID: "2e399280-7b1a-4c02-9ef6-787d93fe41c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.428114 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2e399280-7b1a-4c02-9ef6-787d93fe41c5" (UID: "2e399280-7b1a-4c02-9ef6-787d93fe41c5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.456296 4731 generic.go:334] "Generic (PLEG): container finished" podID="d64c6991-7b2b-4432-8e99-ac9d1989b0ec" containerID="46feab51ee5e267f863d80bcdd090558775a80dc0fe9726b173c4f51b8b4d2d2" exitCode=0 Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.456414 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" event={"ID":"d64c6991-7b2b-4432-8e99-ac9d1989b0ec","Type":"ContainerDied","Data":"46feab51ee5e267f863d80bcdd090558775a80dc0fe9726b173c4f51b8b4d2d2"} Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.459096 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e399280-7b1a-4c02-9ef6-787d93fe41c5-kube-api-access-r8fbf" (OuterVolumeSpecName: "kube-api-access-r8fbf") pod "2e399280-7b1a-4c02-9ef6-787d93fe41c5" (UID: "2e399280-7b1a-4c02-9ef6-787d93fe41c5"). InnerVolumeSpecName "kube-api-access-r8fbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.464610 4731 generic.go:334] "Generic (PLEG): container finished" podID="ff2992ad-2bee-42b4-8dcc-4584ffa3b336" containerID="a8d912d9e70a0de3d02a17fb58f5268c48a72eb303961127fe5ffc6be5f190fe" exitCode=0 Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.464691 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-439b-account-create-update-28tkb" event={"ID":"ff2992ad-2bee-42b4-8dcc-4584ffa3b336","Type":"ContainerDied","Data":"a8d912d9e70a0de3d02a17fb58f5268c48a72eb303961127fe5ffc6be5f190fe"} Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.486576 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e399280-7b1a-4c02-9ef6-787d93fe41c5" (UID: "2e399280-7b1a-4c02-9ef6-787d93fe41c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.497736 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerStarted","Data":"d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0"} Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.504361 4731 generic.go:334] "Generic (PLEG): container finished" podID="12509218-711b-46a4-a560-493ce03af965" containerID="d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d" exitCode=143 Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.504435 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12509218-711b-46a4-a560-493ce03af965","Type":"ContainerDied","Data":"d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d"} Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.505937 4731 generic.go:334] "Generic (PLEG): container finished" podID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerID="9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126" exitCode=137 Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.505988 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e399280-7b1a-4c02-9ef6-787d93fe41c5","Type":"ContainerDied","Data":"9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126"} Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.506005 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e399280-7b1a-4c02-9ef6-787d93fe41c5","Type":"ContainerDied","Data":"c0e4b3df2a1952189bcb040528c72cd9fc09fb7694742dc1a26ade36094da95c"} Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.506025 4731 scope.go:117] "RemoveContainer" containerID="9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 
19:13:55.506184 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.506794 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.506831 4731 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e399280-7b1a-4c02-9ef6-787d93fe41c5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.506847 4731 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.506860 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8fbf\" (UniqueName: \"kubernetes.io/projected/2e399280-7b1a-4c02-9ef6-787d93fe41c5-kube-api-access-r8fbf\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.506875 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.506888 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e399280-7b1a-4c02-9ef6-787d93fe41c5-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.520214 4731 generic.go:334] "Generic (PLEG): container finished" podID="86009e3d-4e12-4660-8104-b71fade8668f" containerID="5246cba9874413165f2d2d3b2375af0bc5ab26225941a46ef8a97fe67863cb92" exitCode=0 Dec 03 19:13:55 crc 
kubenswrapper[4731]: I1203 19:13:55.520796 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xtlvb" event={"ID":"86009e3d-4e12-4660-8104-b71fade8668f","Type":"ContainerDied","Data":"5246cba9874413165f2d2d3b2375af0bc5ab26225941a46ef8a97fe67863cb92"} Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.542358 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data" (OuterVolumeSpecName: "config-data") pod "2e399280-7b1a-4c02-9ef6-787d93fe41c5" (UID: "2e399280-7b1a-4c02-9ef6-787d93fe41c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.570558 4731 scope.go:117] "RemoveContainer" containerID="9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.609698 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e399280-7b1a-4c02-9ef6-787d93fe41c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.638612 4731 scope.go:117] "RemoveContainer" containerID="9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126" Dec 03 19:13:55 crc kubenswrapper[4731]: E1203 19:13:55.653272 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126\": container with ID starting with 9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126 not found: ID does not exist" containerID="9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.653354 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126"} err="failed to get container status \"9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126\": rpc error: code = NotFound desc = could not find container \"9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126\": container with ID starting with 9a7ef2a2829350b6c117804b2cbdf94416a5839e8ddeaecfedbda931ddcd9126 not found: ID does not exist" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.653390 4731 scope.go:117] "RemoveContainer" containerID="9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4" Dec 03 19:13:55 crc kubenswrapper[4731]: E1203 19:13:55.659813 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4\": container with ID starting with 9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4 not found: ID does not exist" containerID="9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.659879 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4"} err="failed to get container status \"9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4\": rpc error: code = NotFound desc = could not find container \"9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4\": container with ID starting with 9f1b57b92fec534eaf0dc9c4cd1f48c54c469f11297edaf30ffa01a48a47dbb4 not found: ID does not exist" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.811318 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.811794 4731 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerName="glance-log" containerID="cri-o://5fc8cd185db9f6cc74a3980df1d6bcbee6f797152e81faf499063ed07fa23458" gracePeriod=30 Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.812953 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerName="glance-httpd" containerID="cri-o://f71dbaecdc1921aa124718e4e8b95186e921c901e8d1fc6171662e778a1b2c06" gracePeriod=30 Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.888374 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.889198 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.928009 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:55 crc kubenswrapper[4731]: E1203 19:13:55.928528 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerName="cinder-api-log" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.928544 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerName="cinder-api-log" Dec 03 19:13:55 crc kubenswrapper[4731]: E1203 19:13:55.928594 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerName="cinder-api" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.928601 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerName="cinder-api" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.928789 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" 
containerName="cinder-api" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.928805 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" containerName="cinder-api-log" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.929908 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.940902 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.941166 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.941306 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 03 19:13:55 crc kubenswrapper[4731]: I1203 19:13:55.945557 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.014966 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.032201 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-config-data\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.032897 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-public-tls-certs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.032995 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00285379-3e6d-4b1c-9de5-d08bacd73c79-logs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.033091 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-scripts\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.034705 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.034820 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.035057 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00285379-3e6d-4b1c-9de5-d08bacd73c79-etc-machine-id\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.035854 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92qk4\" (UniqueName: \"kubernetes.io/projected/00285379-3e6d-4b1c-9de5-d08bacd73c79-kube-api-access-92qk4\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.036237 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-config-data-custom\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138083 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9x9h\" (UniqueName: \"kubernetes.io/projected/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-kube-api-access-f9x9h\") pod \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\" (UID: \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138184 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-operator-scripts\") pod \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\" (UID: \"ff2992ad-2bee-42b4-8dcc-4584ffa3b336\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138735 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-config-data-custom\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138832 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-config-data\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138867 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-public-tls-certs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138886 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00285379-3e6d-4b1c-9de5-d08bacd73c79-logs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138911 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-scripts\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138926 
4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.138950 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.139042 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00285379-3e6d-4b1c-9de5-d08bacd73c79-etc-machine-id\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.139098 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92qk4\" (UniqueName: \"kubernetes.io/projected/00285379-3e6d-4b1c-9de5-d08bacd73c79-kube-api-access-92qk4\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.157101 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff2992ad-2bee-42b4-8dcc-4584ffa3b336" (UID: "ff2992ad-2bee-42b4-8dcc-4584ffa3b336"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.158903 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00285379-3e6d-4b1c-9de5-d08bacd73c79-logs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.159845 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-scripts\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.166081 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.166598 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92qk4\" (UniqueName: \"kubernetes.io/projected/00285379-3e6d-4b1c-9de5-d08bacd73c79-kube-api-access-92qk4\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.198383 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00285379-3e6d-4b1c-9de5-d08bacd73c79-etc-machine-id\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.206605 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.214882 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-config-data\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.218156 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-kube-api-access-f9x9h" (OuterVolumeSpecName: "kube-api-access-f9x9h") pod "ff2992ad-2bee-42b4-8dcc-4584ffa3b336" (UID: "ff2992ad-2bee-42b4-8dcc-4584ffa3b336"). InnerVolumeSpecName "kube-api-access-f9x9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.218858 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-public-tls-certs\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.221990 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00285379-3e6d-4b1c-9de5-d08bacd73c79-config-data-custom\") pod \"cinder-api-0\" (UID: \"00285379-3e6d-4b1c-9de5-d08bacd73c79\") " pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.241290 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc 
kubenswrapper[4731]: I1203 19:13:56.241341 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9x9h\" (UniqueName: \"kubernetes.io/projected/ff2992ad-2bee-42b4-8dcc-4584ffa3b336-kube-api-access-f9x9h\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.274374 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.281439 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.295752 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.298861 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.341454 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.342147 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-operator-scripts\") pod \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\" (UID: \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.342230 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glrdp\" (UniqueName: \"kubernetes.io/projected/6934159f-6d4b-4243-a5c3-0092fe5f58c4-kube-api-access-glrdp\") pod \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\" (UID: \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.342370 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87lrz\" (UniqueName: \"kubernetes.io/projected/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-kube-api-access-87lrz\") pod \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\" (UID: \"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.342529 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8229af75-4935-4593-b482-7c841a710cd9-operator-scripts\") pod \"8229af75-4935-4593-b482-7c841a710cd9\" (UID: \"8229af75-4935-4593-b482-7c841a710cd9\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.342575 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934159f-6d4b-4243-a5c3-0092fe5f58c4-operator-scripts\") pod \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\" (UID: \"6934159f-6d4b-4243-a5c3-0092fe5f58c4\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.342634 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-zl6xc\" (UniqueName: \"kubernetes.io/projected/8229af75-4935-4593-b482-7c841a710cd9-kube-api-access-zl6xc\") pod \"8229af75-4935-4593-b482-7c841a710cd9\" (UID: \"8229af75-4935-4593-b482-7c841a710cd9\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.342692 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmstd\" (UniqueName: \"kubernetes.io/projected/86009e3d-4e12-4660-8104-b71fade8668f-kube-api-access-gmstd\") pod \"86009e3d-4e12-4660-8104-b71fade8668f\" (UID: \"86009e3d-4e12-4660-8104-b71fade8668f\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.342744 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86009e3d-4e12-4660-8104-b71fade8668f-operator-scripts\") pod \"86009e3d-4e12-4660-8104-b71fade8668f\" (UID: \"86009e3d-4e12-4660-8104-b71fade8668f\") " Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.344948 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86009e3d-4e12-4660-8104-b71fade8668f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86009e3d-4e12-4660-8104-b71fade8668f" (UID: "86009e3d-4e12-4660-8104-b71fade8668f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.347134 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8229af75-4935-4593-b482-7c841a710cd9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8229af75-4935-4593-b482-7c841a710cd9" (UID: "8229af75-4935-4593-b482-7c841a710cd9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.348605 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1aa8978-06f7-4e6c-9e3b-3edbf20878bd" (UID: "c1aa8978-06f7-4e6c-9e3b-3edbf20878bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.352792 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6934159f-6d4b-4243-a5c3-0092fe5f58c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6934159f-6d4b-4243-a5c3-0092fe5f58c4" (UID: "6934159f-6d4b-4243-a5c3-0092fe5f58c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.354024 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6934159f-6d4b-4243-a5c3-0092fe5f58c4-kube-api-access-glrdp" (OuterVolumeSpecName: "kube-api-access-glrdp") pod "6934159f-6d4b-4243-a5c3-0092fe5f58c4" (UID: "6934159f-6d4b-4243-a5c3-0092fe5f58c4"). InnerVolumeSpecName "kube-api-access-glrdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.354873 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8229af75-4935-4593-b482-7c841a710cd9-kube-api-access-zl6xc" (OuterVolumeSpecName: "kube-api-access-zl6xc") pod "8229af75-4935-4593-b482-7c841a710cd9" (UID: "8229af75-4935-4593-b482-7c841a710cd9"). InnerVolumeSpecName "kube-api-access-zl6xc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.356972 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-kube-api-access-87lrz" (OuterVolumeSpecName: "kube-api-access-87lrz") pod "c1aa8978-06f7-4e6c-9e3b-3edbf20878bd" (UID: "c1aa8978-06f7-4e6c-9e3b-3edbf20878bd"). InnerVolumeSpecName "kube-api-access-87lrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.365085 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86009e3d-4e12-4660-8104-b71fade8668f-kube-api-access-gmstd" (OuterVolumeSpecName: "kube-api-access-gmstd") pod "86009e3d-4e12-4660-8104-b71fade8668f" (UID: "86009e3d-4e12-4660-8104-b71fade8668f"). InnerVolumeSpecName "kube-api-access-gmstd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.445099 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8229af75-4935-4593-b482-7c841a710cd9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.445579 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934159f-6d4b-4243-a5c3-0092fe5f58c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.445590 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl6xc\" (UniqueName: \"kubernetes.io/projected/8229af75-4935-4593-b482-7c841a710cd9-kube-api-access-zl6xc\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.445606 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmstd\" (UniqueName: 
\"kubernetes.io/projected/86009e3d-4e12-4660-8104-b71fade8668f-kube-api-access-gmstd\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.445615 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86009e3d-4e12-4660-8104-b71fade8668f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.445624 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.445633 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glrdp\" (UniqueName: \"kubernetes.io/projected/6934159f-6d4b-4243-a5c3-0092fe5f58c4-kube-api-access-glrdp\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.445642 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87lrz\" (UniqueName: \"kubernetes.io/projected/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd-kube-api-access-87lrz\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.539673 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-539b-account-create-update-g5ghg" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.539968 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-539b-account-create-update-g5ghg" event={"ID":"c1aa8978-06f7-4e6c-9e3b-3edbf20878bd","Type":"ContainerDied","Data":"923f380c174c7979dae6f383fde43eea40ef05b3df953ddd1ab8d58a4652c24a"} Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.540022 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="923f380c174c7979dae6f383fde43eea40ef05b3df953ddd1ab8d58a4652c24a" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.547853 4731 generic.go:334] "Generic (PLEG): container finished" podID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerID="5fc8cd185db9f6cc74a3980df1d6bcbee6f797152e81faf499063ed07fa23458" exitCode=143 Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.547930 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13c7ff16-cffb-40a4-909c-6c4bca6598a3","Type":"ContainerDied","Data":"5fc8cd185db9f6cc74a3980df1d6bcbee6f797152e81faf499063ed07fa23458"} Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.562222 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-439b-account-create-update-28tkb" event={"ID":"ff2992ad-2bee-42b4-8dcc-4584ffa3b336","Type":"ContainerDied","Data":"f25d1e5c95d3bf5696d04afd0d764851e4f8c362c5d6b08fe83193adc6a80494"} Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.562293 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f25d1e5c95d3bf5696d04afd0d764851e4f8c362c5d6b08fe83193adc6a80494" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.562409 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-439b-account-create-update-28tkb" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.570744 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pcjrh" event={"ID":"8229af75-4935-4593-b482-7c841a710cd9","Type":"ContainerDied","Data":"0b54090a7d65d2ea49af1f053c88450ee6f65f08f0d4b78ba3987bf49cbf71fb"} Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.570791 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b54090a7d65d2ea49af1f053c88450ee6f65f08f0d4b78ba3987bf49cbf71fb" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.570860 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pcjrh" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.582368 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerStarted","Data":"19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b"} Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.585131 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9jrx" event={"ID":"6934159f-6d4b-4243-a5c3-0092fe5f58c4","Type":"ContainerDied","Data":"e478f5a515cd80078799409c348b3e66dd775798a7b544b0747218422b21a14a"} Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.585182 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e478f5a515cd80078799409c348b3e66dd775798a7b544b0747218422b21a14a" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.585242 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s9jrx" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.623452 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xtlvb" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.623526 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xtlvb" event={"ID":"86009e3d-4e12-4660-8104-b71fade8668f","Type":"ContainerDied","Data":"c1d1d42c7cb5fd620947256c1e873883c8b3e358e2c7b1968f875cb0a2f149b1"} Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.623564 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1d1d42c7cb5fd620947256c1e873883c8b3e358e2c7b1968f875cb0a2f149b1" Dec 03 19:13:56 crc kubenswrapper[4731]: I1203 19:13:56.925552 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.133751 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.282590 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-operator-scripts\") pod \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\" (UID: \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\") " Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.283407 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhbks\" (UniqueName: \"kubernetes.io/projected/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-kube-api-access-qhbks\") pod \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\" (UID: \"d64c6991-7b2b-4432-8e99-ac9d1989b0ec\") " Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.285562 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d64c6991-7b2b-4432-8e99-ac9d1989b0ec" (UID: 
"d64c6991-7b2b-4432-8e99-ac9d1989b0ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.290217 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-kube-api-access-qhbks" (OuterVolumeSpecName: "kube-api-access-qhbks") pod "d64c6991-7b2b-4432-8e99-ac9d1989b0ec" (UID: "d64c6991-7b2b-4432-8e99-ac9d1989b0ec"). InnerVolumeSpecName "kube-api-access-qhbks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.386628 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhbks\" (UniqueName: \"kubernetes.io/projected/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-kube-api-access-qhbks\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.386918 4731 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d64c6991-7b2b-4432-8e99-ac9d1989b0ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.635180 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.637672 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c53f-account-create-update-vbgt5" event={"ID":"d64c6991-7b2b-4432-8e99-ac9d1989b0ec","Type":"ContainerDied","Data":"d6c6ef47c8baa7f963ce983c6b25f67d9bc73761d488229aaa041de2761e193b"} Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.637767 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c6ef47c8baa7f963ce983c6b25f67d9bc73761d488229aaa041de2761e193b" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.643282 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="ceilometer-central-agent" containerID="cri-o://017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23" gracePeriod=30 Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.643495 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerStarted","Data":"2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30"} Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.643968 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.643585 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="sg-core" containerID="cri-o://19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b" gracePeriod=30 Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.643586 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" 
containerName="ceilometer-notification-agent" containerID="cri-o://d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0" gracePeriod=30 Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.643553 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="proxy-httpd" containerID="cri-o://2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30" gracePeriod=30 Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.649332 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00285379-3e6d-4b1c-9de5-d08bacd73c79","Type":"ContainerStarted","Data":"2ebe421042695a53a741324277a50a61f613de94c41efd0b6698d8188a910b5b"} Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.685236 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.081744275 podStartE2EDuration="6.685207361s" podCreationTimestamp="2025-12-03 19:13:51 +0000 UTC" firstStartedPulling="2025-12-03 19:13:52.397636319 +0000 UTC m=+1152.996230783" lastFinishedPulling="2025-12-03 19:13:57.001099395 +0000 UTC m=+1157.599693869" observedRunningTime="2025-12-03 19:13:57.67143984 +0000 UTC m=+1158.270034314" watchObservedRunningTime="2025-12-03 19:13:57.685207361 +0000 UTC m=+1158.283801825" Dec 03 19:13:57 crc kubenswrapper[4731]: I1203 19:13:57.871542 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e399280-7b1a-4c02-9ef6-787d93fe41c5" path="/var/lib/kubelet/pods/2e399280-7b1a-4c02-9ef6-787d93fe41c5/volumes" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.346898 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.413943 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-logs\") pod \"12509218-711b-46a4-a560-493ce03af965\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.414017 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-public-tls-certs\") pod \"12509218-711b-46a4-a560-493ce03af965\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.414049 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5d5f\" (UniqueName: \"kubernetes.io/projected/12509218-711b-46a4-a560-493ce03af965-kube-api-access-q5d5f\") pod \"12509218-711b-46a4-a560-493ce03af965\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.414076 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-scripts\") pod \"12509218-711b-46a4-a560-493ce03af965\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.414106 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-httpd-run\") pod \"12509218-711b-46a4-a560-493ce03af965\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.414159 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"12509218-711b-46a4-a560-493ce03af965\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.414243 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-combined-ca-bundle\") pod \"12509218-711b-46a4-a560-493ce03af965\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.414283 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-config-data\") pod \"12509218-711b-46a4-a560-493ce03af965\" (UID: \"12509218-711b-46a4-a560-493ce03af965\") " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.421577 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-logs" (OuterVolumeSpecName: "logs") pod "12509218-711b-46a4-a560-493ce03af965" (UID: "12509218-711b-46a4-a560-493ce03af965"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.421724 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "12509218-711b-46a4-a560-493ce03af965" (UID: "12509218-711b-46a4-a560-493ce03af965"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.434821 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "12509218-711b-46a4-a560-493ce03af965" (UID: "12509218-711b-46a4-a560-493ce03af965"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.439080 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-scripts" (OuterVolumeSpecName: "scripts") pod "12509218-711b-46a4-a560-493ce03af965" (UID: "12509218-711b-46a4-a560-493ce03af965"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.439225 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12509218-711b-46a4-a560-493ce03af965-kube-api-access-q5d5f" (OuterVolumeSpecName: "kube-api-access-q5d5f") pod "12509218-711b-46a4-a560-493ce03af965" (UID: "12509218-711b-46a4-a560-493ce03af965"). InnerVolumeSpecName "kube-api-access-q5d5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.482418 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12509218-711b-46a4-a560-493ce03af965" (UID: "12509218-711b-46a4-a560-493ce03af965"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.516179 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.516219 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.516235 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5d5f\" (UniqueName: \"kubernetes.io/projected/12509218-711b-46a4-a560-493ce03af965-kube-api-access-q5d5f\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.516269 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.516282 4731 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12509218-711b-46a4-a560-493ce03af965-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.516310 4731 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.520971 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "12509218-711b-46a4-a560-493ce03af965" (UID: "12509218-711b-46a4-a560-493ce03af965"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.533657 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-config-data" (OuterVolumeSpecName: "config-data") pod "12509218-711b-46a4-a560-493ce03af965" (UID: "12509218-711b-46a4-a560-493ce03af965"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.550689 4731 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.617845 4731 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.618368 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.618452 4731 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12509218-711b-46a4-a560-493ce03af965-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.661907 4731 generic.go:334] "Generic (PLEG): container finished" podID="12509218-711b-46a4-a560-493ce03af965" containerID="2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916" exitCode=0 Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.661997 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"12509218-711b-46a4-a560-493ce03af965","Type":"ContainerDied","Data":"2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916"} Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.662032 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12509218-711b-46a4-a560-493ce03af965","Type":"ContainerDied","Data":"49f664c93260fd8752114706cba86b7a3fed72541df9731bb81d119e0a1bd02e"} Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.662054 4731 scope.go:117] "RemoveContainer" containerID="2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.662049 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.670696 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00285379-3e6d-4b1c-9de5-d08bacd73c79","Type":"ContainerStarted","Data":"f09fffbb09ca4adc084c033c881ac6fb45b85ec02a77649a5b96fe64957968b2"} Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.670771 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"00285379-3e6d-4b1c-9de5-d08bacd73c79","Type":"ContainerStarted","Data":"c2d65c4e657df8108d1f63a80458b1f0e34bf24f17d9cf9064bd210e5546c849"} Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.672112 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.677291 4731 generic.go:334] "Generic (PLEG): container finished" podID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerID="2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30" exitCode=0 Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.677323 4731 generic.go:334] "Generic (PLEG): container finished" 
podID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerID="19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b" exitCode=2 Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.677331 4731 generic.go:334] "Generic (PLEG): container finished" podID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerID="d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0" exitCode=0 Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.677355 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerDied","Data":"2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30"} Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.677382 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerDied","Data":"19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b"} Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.677395 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerDied","Data":"d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0"} Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.700732 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.700708391 podStartE2EDuration="3.700708391s" podCreationTimestamp="2025-12-03 19:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:13:58.695852729 +0000 UTC m=+1159.294447213" watchObservedRunningTime="2025-12-03 19:13:58.700708391 +0000 UTC m=+1159.299302865" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.733879 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] 
Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.734549 4731 scope.go:117] "RemoveContainer" containerID="d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.776413 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.803329 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.803865 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64c6991-7b2b-4432-8e99-ac9d1989b0ec" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.803881 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64c6991-7b2b-4432-8e99-ac9d1989b0ec" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.803896 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6934159f-6d4b-4243-a5c3-0092fe5f58c4" containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.803903 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6934159f-6d4b-4243-a5c3-0092fe5f58c4" containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.803919 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1aa8978-06f7-4e6c-9e3b-3edbf20878bd" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.803926 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1aa8978-06f7-4e6c-9e3b-3edbf20878bd" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.803944 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12509218-711b-46a4-a560-493ce03af965" containerName="glance-log" Dec 03 19:13:58 crc 
kubenswrapper[4731]: I1203 19:13:58.803950 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="12509218-711b-46a4-a560-493ce03af965" containerName="glance-log" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.803959 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12509218-711b-46a4-a560-493ce03af965" containerName="glance-httpd" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.803965 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="12509218-711b-46a4-a560-493ce03af965" containerName="glance-httpd" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.803977 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8229af75-4935-4593-b482-7c841a710cd9" containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.803983 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8229af75-4935-4593-b482-7c841a710cd9" containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.803991 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2992ad-2bee-42b4-8dcc-4584ffa3b336" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.803997 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2992ad-2bee-42b4-8dcc-4584ffa3b336" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.804006 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86009e3d-4e12-4660-8104-b71fade8668f" containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804012 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="86009e3d-4e12-4660-8104-b71fade8668f" containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804195 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="86009e3d-4e12-4660-8104-b71fade8668f" 
containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804208 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2992ad-2bee-42b4-8dcc-4584ffa3b336" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804218 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64c6991-7b2b-4432-8e99-ac9d1989b0ec" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804226 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="12509218-711b-46a4-a560-493ce03af965" containerName="glance-log" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804237 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1aa8978-06f7-4e6c-9e3b-3edbf20878bd" containerName="mariadb-account-create-update" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804246 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="6934159f-6d4b-4243-a5c3-0092fe5f58c4" containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804272 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="12509218-711b-46a4-a560-493ce03af965" containerName="glance-httpd" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.804282 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8229af75-4935-4593-b482-7c841a710cd9" containerName="mariadb-database-create" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.805409 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.824919 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.826025 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.834702 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.867724 4731 scope.go:117] "RemoveContainer" containerID="2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.871421 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916\": container with ID starting with 2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916 not found: ID does not exist" containerID="2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.871481 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916"} err="failed to get container status \"2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916\": rpc error: code = NotFound desc = could not find container \"2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916\": container with ID starting with 2902c2d0606b6590dee89e922a511180fa1a52cd49837eadc1d1127399129916 not found: ID does not exist" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.871516 4731 scope.go:117] "RemoveContainer" 
containerID="d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d" Dec 03 19:13:58 crc kubenswrapper[4731]: E1203 19:13:58.878571 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d\": container with ID starting with d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d not found: ID does not exist" containerID="d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.878613 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d"} err="failed to get container status \"d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d\": rpc error: code = NotFound desc = could not find container \"d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d\": container with ID starting with d75f3c597ed19f72eaa4eca84282a2b32a21ea5f79639d4b269ecc0beb31ca2d not found: ID does not exist" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.927042 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f83d4f-46da-43e3-9fab-56e0e45dd76d-logs\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.927098 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjrb7\" (UniqueName: \"kubernetes.io/projected/74f83d4f-46da-43e3-9fab-56e0e45dd76d-kube-api-access-xjrb7\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 
19:13:58.927126 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.927155 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-config-data\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.927219 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.927240 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74f83d4f-46da-43e3-9fab-56e0e45dd76d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc kubenswrapper[4731]: I1203 19:13:58.927278 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-scripts\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:58 crc 
kubenswrapper[4731]: I1203 19:13:58.927345 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.013099 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.028728 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.029027 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74f83d4f-46da-43e3-9fab-56e0e45dd76d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.029618 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-scripts\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.030172 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: 
\"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.030337 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f83d4f-46da-43e3-9fab-56e0e45dd76d-logs\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.030420 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjrb7\" (UniqueName: \"kubernetes.io/projected/74f83d4f-46da-43e3-9fab-56e0e45dd76d-kube-api-access-xjrb7\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.030516 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.030654 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-config-data\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.030777 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") device mount path 
\"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.029549 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74f83d4f-46da-43e3-9fab-56e0e45dd76d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.031188 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f83d4f-46da-43e3-9fab-56e0e45dd76d-logs\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.035779 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.035983 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-config-data\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.037522 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-scripts\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.043330 
4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74f83d4f-46da-43e3-9fab-56e0e45dd76d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.054031 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5cdc55748f-txbzr" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.060661 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjrb7\" (UniqueName: \"kubernetes.io/projected/74f83d4f-46da-43e3-9fab-56e0e45dd76d-kube-api-access-xjrb7\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.071797 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"74f83d4f-46da-43e3-9fab-56e0e45dd76d\") " pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.178548 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 19:13:59 crc kubenswrapper[4731]: E1203 19:13:59.487792 4731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13c7ff16_cffb_40a4_909c_6c4bca6598a3.slice/crio-conmon-f71dbaecdc1921aa124718e4e8b95186e921c901e8d1fc6171662e778a1b2c06.scope\": RecentStats: unable to find data in memory cache]" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.695797 4731 generic.go:334] "Generic (PLEG): container finished" podID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerID="f71dbaecdc1921aa124718e4e8b95186e921c901e8d1fc6171662e778a1b2c06" exitCode=0 Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.696091 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13c7ff16-cffb-40a4-909c-6c4bca6598a3","Type":"ContainerDied","Data":"f71dbaecdc1921aa124718e4e8b95186e921c901e8d1fc6171662e778a1b2c06"} Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.830882 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 19:13:59 crc kubenswrapper[4731]: W1203 19:13:59.833517 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74f83d4f_46da_43e3_9fab_56e0e45dd76d.slice/crio-a9cb8db82692e9dc0e790fdf905f6cb90d9c4fe714feea610f80862c9bcfba0f WatchSource:0}: Error finding container a9cb8db82692e9dc0e790fdf905f6cb90d9c4fe714feea610f80862c9bcfba0f: Status 404 returned error can't find the container with id a9cb8db82692e9dc0e790fdf905f6cb90d9c4fe714feea610f80862c9bcfba0f Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.884828 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12509218-711b-46a4-a560-493ce03af965" 
path="/var/lib/kubelet/pods/12509218-711b-46a4-a560-493ce03af965/volumes" Dec 03 19:13:59 crc kubenswrapper[4731]: I1203 19:13:59.963989 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.055799 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-combined-ca-bundle\") pod \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.056563 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-logs\") pod \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.056694 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nskb2\" (UniqueName: \"kubernetes.io/projected/13c7ff16-cffb-40a4-909c-6c4bca6598a3-kube-api-access-nskb2\") pod \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.056836 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.056962 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-internal-tls-certs\") pod \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") 
" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.057133 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-config-data\") pod \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.057240 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-httpd-run\") pod \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.057365 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-scripts\") pod \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\" (UID: \"13c7ff16-cffb-40a4-909c-6c4bca6598a3\") " Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.059328 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-logs" (OuterVolumeSpecName: "logs") pod "13c7ff16-cffb-40a4-909c-6c4bca6598a3" (UID: "13c7ff16-cffb-40a4-909c-6c4bca6598a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.059716 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "13c7ff16-cffb-40a4-909c-6c4bca6598a3" (UID: "13c7ff16-cffb-40a4-909c-6c4bca6598a3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.065998 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c7ff16-cffb-40a4-909c-6c4bca6598a3-kube-api-access-nskb2" (OuterVolumeSpecName: "kube-api-access-nskb2") pod "13c7ff16-cffb-40a4-909c-6c4bca6598a3" (UID: "13c7ff16-cffb-40a4-909c-6c4bca6598a3"). InnerVolumeSpecName "kube-api-access-nskb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.070830 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "13c7ff16-cffb-40a4-909c-6c4bca6598a3" (UID: "13c7ff16-cffb-40a4-909c-6c4bca6598a3"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.084485 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-scripts" (OuterVolumeSpecName: "scripts") pod "13c7ff16-cffb-40a4-909c-6c4bca6598a3" (UID: "13c7ff16-cffb-40a4-909c-6c4bca6598a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.098943 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13c7ff16-cffb-40a4-909c-6c4bca6598a3" (UID: "13c7ff16-cffb-40a4-909c-6c4bca6598a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.122043 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "13c7ff16-cffb-40a4-909c-6c4bca6598a3" (UID: "13c7ff16-cffb-40a4-909c-6c4bca6598a3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.155021 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-config-data" (OuterVolumeSpecName: "config-data") pod "13c7ff16-cffb-40a4-909c-6c4bca6598a3" (UID: "13c7ff16-cffb-40a4-909c-6c4bca6598a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.160710 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.160744 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.160758 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.160768 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nskb2\" (UniqueName: \"kubernetes.io/projected/13c7ff16-cffb-40a4-909c-6c4bca6598a3-kube-api-access-nskb2\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:00 
crc kubenswrapper[4731]: I1203 19:14:00.160799 4731 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.160812 4731 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.160821 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7ff16-cffb-40a4-909c-6c4bca6598a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.160829 4731 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7ff16-cffb-40a4-909c-6c4bca6598a3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.185376 4731 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.263160 4731 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.717504 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13c7ff16-cffb-40a4-909c-6c4bca6598a3","Type":"ContainerDied","Data":"1427351a8bca3db6a24f12bacb21dcbf92cda5bb84e59b0d8d21afc63d686501"} Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.717882 4731 scope.go:117] "RemoveContainer" containerID="f71dbaecdc1921aa124718e4e8b95186e921c901e8d1fc6171662e778a1b2c06" Dec 03 
19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.717781 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.721713 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74f83d4f-46da-43e3-9fab-56e0e45dd76d","Type":"ContainerStarted","Data":"14b599669d10ceda0000c32ee068389fa8f4360a7a36a5cf89d62bc1d9305ff7"} Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.721757 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74f83d4f-46da-43e3-9fab-56e0e45dd76d","Type":"ContainerStarted","Data":"a9cb8db82692e9dc0e790fdf905f6cb90d9c4fe714feea610f80862c9bcfba0f"} Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.759807 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.770169 4731 scope.go:117] "RemoveContainer" containerID="5fc8cd185db9f6cc74a3980df1d6bcbee6f797152e81faf499063ed07fa23458" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.787952 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.805231 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:14:00 crc kubenswrapper[4731]: E1203 19:14:00.805676 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerName="glance-log" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.805695 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerName="glance-log" Dec 03 19:14:00 crc kubenswrapper[4731]: E1203 19:14:00.805735 4731 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerName="glance-httpd" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.805742 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerName="glance-httpd" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.805921 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerName="glance-httpd" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.805951 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" containerName="glance-log" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.807008 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.810880 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.811217 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.832293 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.883711 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/336bb4e4-deca-445b-af0f-2df6ea097a14-logs\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.883777 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/336bb4e4-deca-445b-af0f-2df6ea097a14-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.883837 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.884030 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-config-data\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.884102 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.884149 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.884177 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cn8xd\" (UniqueName: \"kubernetes.io/projected/336bb4e4-deca-445b-af0f-2df6ea097a14-kube-api-access-cn8xd\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.884270 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-scripts\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.986198 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-scripts\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.986309 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/336bb4e4-deca-445b-af0f-2df6ea097a14-logs\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.986348 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/336bb4e4-deca-445b-af0f-2df6ea097a14-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.986396 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.986437 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-config-data\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.986478 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.986503 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.986525 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn8xd\" (UniqueName: \"kubernetes.io/projected/336bb4e4-deca-445b-af0f-2df6ea097a14-kube-api-access-cn8xd\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.989325 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/336bb4e4-deca-445b-af0f-2df6ea097a14-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.989399 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/336bb4e4-deca-445b-af0f-2df6ea097a14-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.989983 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.991827 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.993616 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-config-data\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:00 crc kubenswrapper[4731]: I1203 19:14:00.995983 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-scripts\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.012243 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336bb4e4-deca-445b-af0f-2df6ea097a14-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.018424 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn8xd\" (UniqueName: \"kubernetes.io/projected/336bb4e4-deca-445b-af0f-2df6ea097a14-kube-api-access-cn8xd\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.037124 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.038353 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"336bb4e4-deca-445b-af0f-2df6ea097a14\") " pod="openstack/glance-default-internal-api-0" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.080133 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-687f68f6b4-jvzgv" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.136:8443: connect: connection refused" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.080670 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.168213 4731 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.733489 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74f83d4f-46da-43e3-9fab-56e0e45dd76d","Type":"ContainerStarted","Data":"b87e40998bfebcf2751beb653195fc3fe0499f9eb29b5f29f07569065918f8d0"} Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.761850 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.765225 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.765202056 podStartE2EDuration="3.765202056s" podCreationTimestamp="2025-12-03 19:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:14:01.761878961 +0000 UTC m=+1162.360473445" watchObservedRunningTime="2025-12-03 19:14:01.765202056 +0000 UTC m=+1162.363796520" Dec 03 19:14:01 crc kubenswrapper[4731]: I1203 19:14:01.898903 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c7ff16-cffb-40a4-909c-6c4bca6598a3" path="/var/lib/kubelet/pods/13c7ff16-cffb-40a4-909c-6c4bca6598a3/volumes" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.609132 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nrxcp"] Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.610784 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.616136 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.616486 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.616620 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rsgdq" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.621287 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nrxcp"] Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.662801 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-scripts\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.662923 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrzlt\" (UniqueName: \"kubernetes.io/projected/996da76c-8786-41f9-aa84-4722e2755f0e-kube-api-access-rrzlt\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.663371 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-config-data\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " 
pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.663395 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.744362 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"336bb4e4-deca-445b-af0f-2df6ea097a14","Type":"ContainerStarted","Data":"f973bf1f2a0884675171be23279cab87c8e9e3bff9e158f88dceda98e931e22b"} Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.744419 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"336bb4e4-deca-445b-af0f-2df6ea097a14","Type":"ContainerStarted","Data":"68750b97df9c4bd2e6adf294d75832246109a774bc3816f53b33c303e9beedba"} Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.765632 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-config-data\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.765724 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.765815 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-scripts\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.765897 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrzlt\" (UniqueName: \"kubernetes.io/projected/996da76c-8786-41f9-aa84-4722e2755f0e-kube-api-access-rrzlt\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.771373 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-scripts\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.772042 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-config-data\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.776796 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.788239 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rrzlt\" (UniqueName: \"kubernetes.io/projected/996da76c-8786-41f9-aa84-4722e2755f0e-kube-api-access-rrzlt\") pod \"nova-cell0-conductor-db-sync-nrxcp\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:02 crc kubenswrapper[4731]: I1203 19:14:02.938573 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.397641 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.481355 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-log-httpd\") pod \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.481421 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-combined-ca-bundle\") pod \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.481466 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-scripts\") pod \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.481632 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-ceilometer-tls-certs\") pod \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\" (UID: 
\"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.481664 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-sg-core-conf-yaml\") pod \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.481728 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-run-httpd\") pod \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.481792 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-config-data\") pod \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.481819 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbnwd\" (UniqueName: \"kubernetes.io/projected/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-kube-api-access-fbnwd\") pod \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\" (UID: \"37efbdfd-77b5-4d8c-90cc-c8a6786f189a\") " Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.482610 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "37efbdfd-77b5-4d8c-90cc-c8a6786f189a" (UID: "37efbdfd-77b5-4d8c-90cc-c8a6786f189a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.483048 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "37efbdfd-77b5-4d8c-90cc-c8a6786f189a" (UID: "37efbdfd-77b5-4d8c-90cc-c8a6786f189a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.488788 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-kube-api-access-fbnwd" (OuterVolumeSpecName: "kube-api-access-fbnwd") pod "37efbdfd-77b5-4d8c-90cc-c8a6786f189a" (UID: "37efbdfd-77b5-4d8c-90cc-c8a6786f189a"). InnerVolumeSpecName "kube-api-access-fbnwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.496744 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-scripts" (OuterVolumeSpecName: "scripts") pod "37efbdfd-77b5-4d8c-90cc-c8a6786f189a" (UID: "37efbdfd-77b5-4d8c-90cc-c8a6786f189a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.527677 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "37efbdfd-77b5-4d8c-90cc-c8a6786f189a" (UID: "37efbdfd-77b5-4d8c-90cc-c8a6786f189a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.584622 4731 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.584658 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.584670 4731 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.584686 4731 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.584696 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbnwd\" (UniqueName: \"kubernetes.io/projected/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-kube-api-access-fbnwd\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.609057 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "37efbdfd-77b5-4d8c-90cc-c8a6786f189a" (UID: "37efbdfd-77b5-4d8c-90cc-c8a6786f189a"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.609200 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37efbdfd-77b5-4d8c-90cc-c8a6786f189a" (UID: "37efbdfd-77b5-4d8c-90cc-c8a6786f189a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.638895 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-config-data" (OuterVolumeSpecName: "config-data") pod "37efbdfd-77b5-4d8c-90cc-c8a6786f189a" (UID: "37efbdfd-77b5-4d8c-90cc-c8a6786f189a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.642165 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nrxcp"] Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.686749 4731 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.687164 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.687177 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37efbdfd-77b5-4d8c-90cc-c8a6786f189a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.755541 4731 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"336bb4e4-deca-445b-af0f-2df6ea097a14","Type":"ContainerStarted","Data":"3df89a7d22b1b1b9852b62bd314c3477b2f1453c80e8900009c127d8e9b1a0bd"} Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.758134 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" event={"ID":"996da76c-8786-41f9-aa84-4722e2755f0e","Type":"ContainerStarted","Data":"c07212019f81426a155d4f2610266478ee6da00c28d9a195344202910c5a72e4"} Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.761681 4731 generic.go:334] "Generic (PLEG): container finished" podID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerID="017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23" exitCode=0 Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.761759 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerDied","Data":"017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23"} Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.761894 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37efbdfd-77b5-4d8c-90cc-c8a6786f189a","Type":"ContainerDied","Data":"28a1bebeed207950e7facfb3c3e5754e3b3a9d77da109234eef8127d635481e2"} Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.761964 4731 scope.go:117] "RemoveContainer" containerID="2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.761787 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.789213 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.7891654949999998 podStartE2EDuration="3.789165495s" podCreationTimestamp="2025-12-03 19:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:14:03.777653615 +0000 UTC m=+1164.376248079" watchObservedRunningTime="2025-12-03 19:14:03.789165495 +0000 UTC m=+1164.387759959" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.800049 4731 scope.go:117] "RemoveContainer" containerID="19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.849972 4731 scope.go:117] "RemoveContainer" containerID="d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.887698 4731 scope.go:117] "RemoveContainer" containerID="017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.910720 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.917072 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.929795 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:03 crc kubenswrapper[4731]: E1203 19:14:03.931286 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="sg-core" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.931307 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="sg-core" Dec 03 
19:14:03 crc kubenswrapper[4731]: E1203 19:14:03.931321 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="ceilometer-central-agent" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.931328 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="ceilometer-central-agent" Dec 03 19:14:03 crc kubenswrapper[4731]: E1203 19:14:03.931355 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="proxy-httpd" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.931363 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="proxy-httpd" Dec 03 19:14:03 crc kubenswrapper[4731]: E1203 19:14:03.931386 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="ceilometer-notification-agent" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.931392 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="ceilometer-notification-agent" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.931599 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="proxy-httpd" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.931615 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="sg-core" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.931623 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="ceilometer-notification-agent" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.931629 4731 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" containerName="ceilometer-central-agent" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.936900 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.938887 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.939311 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.941036 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.942126 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.948324 4731 scope.go:117] "RemoveContainer" containerID="2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30" Dec 03 19:14:03 crc kubenswrapper[4731]: E1203 19:14:03.949553 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30\": container with ID starting with 2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30 not found: ID does not exist" containerID="2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.949610 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30"} err="failed to get container status \"2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30\": rpc error: code = NotFound desc = could not find container 
\"2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30\": container with ID starting with 2e353589496cc961a3fe039982ca99c06d6d6947f8d3714925ad7742c73d6f30 not found: ID does not exist" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.949633 4731 scope.go:117] "RemoveContainer" containerID="19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b" Dec 03 19:14:03 crc kubenswrapper[4731]: E1203 19:14:03.950472 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b\": container with ID starting with 19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b not found: ID does not exist" containerID="19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.950559 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b"} err="failed to get container status \"19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b\": rpc error: code = NotFound desc = could not find container \"19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b\": container with ID starting with 19b4df199a70df859127345f3983a0bfe69d7ee5c5b3e2edf7eaa7dba3c3cd1b not found: ID does not exist" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.950678 4731 scope.go:117] "RemoveContainer" containerID="d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0" Dec 03 19:14:03 crc kubenswrapper[4731]: E1203 19:14:03.951246 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0\": container with ID starting with d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0 not found: ID does not exist" 
containerID="d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.951298 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0"} err="failed to get container status \"d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0\": rpc error: code = NotFound desc = could not find container \"d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0\": container with ID starting with d9cfde0416bf18ae54f1d2e596eed73f24c77f0fef59508f1570489f4588b8e0 not found: ID does not exist" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.951330 4731 scope.go:117] "RemoveContainer" containerID="017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23" Dec 03 19:14:03 crc kubenswrapper[4731]: E1203 19:14:03.957555 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23\": container with ID starting with 017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23 not found: ID does not exist" containerID="017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.957604 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23"} err="failed to get container status \"017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23\": rpc error: code = NotFound desc = could not find container \"017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23\": container with ID starting with 017a5305c49ff68cdb7445d7fe6568d780746a57d564ba0e42f4ce45d8d2ae23 not found: ID does not exist" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.993324 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-config-data\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.993398 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j4pd\" (UniqueName: \"kubernetes.io/projected/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-kube-api-access-5j4pd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.993442 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.993473 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.993507 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-scripts\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.993678 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.993794 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-run-httpd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:03 crc kubenswrapper[4731]: I1203 19:14:03.993837 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-log-httpd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.096275 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-config-data\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.096353 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j4pd\" (UniqueName: \"kubernetes.io/projected/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-kube-api-access-5j4pd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.096391 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " 
pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.096421 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.096447 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-scripts\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.096518 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.096572 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-run-httpd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.096591 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-log-httpd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.097088 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-log-httpd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.097357 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-run-httpd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.101317 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.102466 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-config-data\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.103333 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.105585 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.107225 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-scripts\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.118936 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j4pd\" (UniqueName: \"kubernetes.io/projected/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-kube-api-access-5j4pd\") pod \"ceilometer-0\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.261762 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.864587 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:04 crc kubenswrapper[4731]: I1203 19:14:04.892908 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:05 crc kubenswrapper[4731]: I1203 19:14:05.811897 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerStarted","Data":"83b39c9abc9d0f7402da9e44dcc48f9063f489eb5229fa8b96a60639655bf2f7"} Dec 03 19:14:05 crc kubenswrapper[4731]: I1203 19:14:05.868704 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37efbdfd-77b5-4d8c-90cc-c8a6786f189a" path="/var/lib/kubelet/pods/37efbdfd-77b5-4d8c-90cc-c8a6786f189a/volumes" Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.838273 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerStarted","Data":"0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6"} Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 
19:14:06.839280 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerStarted","Data":"64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7"} Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.857759 4731 generic.go:334] "Generic (PLEG): container finished" podID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerID="cbf678eabf96eed394c8395472bdf57b49dfe6cd318626870fa3106aed7d89b5" exitCode=137 Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.857820 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687f68f6b4-jvzgv" event={"ID":"f69c7907-c04f-4b84-9e31-59fca146a62d","Type":"ContainerDied","Data":"cbf678eabf96eed394c8395472bdf57b49dfe6cd318626870fa3106aed7d89b5"} Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.857856 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687f68f6b4-jvzgv" event={"ID":"f69c7907-c04f-4b84-9e31-59fca146a62d","Type":"ContainerDied","Data":"434eff705dcb8799587de37641c4bff976c05a97f48dd016c9b71498964c1dd3"} Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.857873 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="434eff705dcb8799587de37641c4bff976c05a97f48dd016c9b71498964c1dd3" Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.906642 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.974396 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-config-data\") pod \"f69c7907-c04f-4b84-9e31-59fca146a62d\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.974542 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-scripts\") pod \"f69c7907-c04f-4b84-9e31-59fca146a62d\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.974575 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdsbx\" (UniqueName: \"kubernetes.io/projected/f69c7907-c04f-4b84-9e31-59fca146a62d-kube-api-access-sdsbx\") pod \"f69c7907-c04f-4b84-9e31-59fca146a62d\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.975469 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-combined-ca-bundle\") pod \"f69c7907-c04f-4b84-9e31-59fca146a62d\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.975510 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f69c7907-c04f-4b84-9e31-59fca146a62d-logs\") pod \"f69c7907-c04f-4b84-9e31-59fca146a62d\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.975542 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-secret-key\") pod \"f69c7907-c04f-4b84-9e31-59fca146a62d\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.975584 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-tls-certs\") pod \"f69c7907-c04f-4b84-9e31-59fca146a62d\" (UID: \"f69c7907-c04f-4b84-9e31-59fca146a62d\") " Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.976365 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f69c7907-c04f-4b84-9e31-59fca146a62d-logs" (OuterVolumeSpecName: "logs") pod "f69c7907-c04f-4b84-9e31-59fca146a62d" (UID: "f69c7907-c04f-4b84-9e31-59fca146a62d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.983539 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f69c7907-c04f-4b84-9e31-59fca146a62d" (UID: "f69c7907-c04f-4b84-9e31-59fca146a62d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:06 crc kubenswrapper[4731]: I1203 19:14:06.988415 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69c7907-c04f-4b84-9e31-59fca146a62d-kube-api-access-sdsbx" (OuterVolumeSpecName: "kube-api-access-sdsbx") pod "f69c7907-c04f-4b84-9e31-59fca146a62d" (UID: "f69c7907-c04f-4b84-9e31-59fca146a62d"). InnerVolumeSpecName "kube-api-access-sdsbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.016481 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-config-data" (OuterVolumeSpecName: "config-data") pod "f69c7907-c04f-4b84-9e31-59fca146a62d" (UID: "f69c7907-c04f-4b84-9e31-59fca146a62d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.018546 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-scripts" (OuterVolumeSpecName: "scripts") pod "f69c7907-c04f-4b84-9e31-59fca146a62d" (UID: "f69c7907-c04f-4b84-9e31-59fca146a62d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.046956 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f69c7907-c04f-4b84-9e31-59fca146a62d" (UID: "f69c7907-c04f-4b84-9e31-59fca146a62d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.059748 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f69c7907-c04f-4b84-9e31-59fca146a62d" (UID: "f69c7907-c04f-4b84-9e31-59fca146a62d"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.078803 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.078861 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f69c7907-c04f-4b84-9e31-59fca146a62d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.078874 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdsbx\" (UniqueName: \"kubernetes.io/projected/f69c7907-c04f-4b84-9e31-59fca146a62d-kube-api-access-sdsbx\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.078887 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.078899 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f69c7907-c04f-4b84-9e31-59fca146a62d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.078908 4731 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.078921 4731 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f69c7907-c04f-4b84-9e31-59fca146a62d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.873569 4731 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-687f68f6b4-jvzgv" Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.873567 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerStarted","Data":"6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642"} Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.928504 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-687f68f6b4-jvzgv"] Dec 03 19:14:07 crc kubenswrapper[4731]: I1203 19:14:07.948341 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-687f68f6b4-jvzgv"] Dec 03 19:14:09 crc kubenswrapper[4731]: I1203 19:14:09.016345 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 19:14:09 crc kubenswrapper[4731]: I1203 19:14:09.178960 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 19:14:09 crc kubenswrapper[4731]: I1203 19:14:09.179031 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 19:14:09 crc kubenswrapper[4731]: I1203 19:14:09.224732 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 19:14:09 crc kubenswrapper[4731]: I1203 19:14:09.242425 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 19:14:09 crc kubenswrapper[4731]: I1203 19:14:09.877669 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" path="/var/lib/kubelet/pods/f69c7907-c04f-4b84-9e31-59fca146a62d/volumes" Dec 03 19:14:09 crc kubenswrapper[4731]: I1203 19:14:09.900737 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-external-api-0" Dec 03 19:14:09 crc kubenswrapper[4731]: I1203 19:14:09.901069 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 19:14:11 crc kubenswrapper[4731]: I1203 19:14:11.169687 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:11 crc kubenswrapper[4731]: I1203 19:14:11.169750 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:11 crc kubenswrapper[4731]: I1203 19:14:11.216940 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:11 crc kubenswrapper[4731]: I1203 19:14:11.227195 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:11 crc kubenswrapper[4731]: I1203 19:14:11.925367 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:11 crc kubenswrapper[4731]: I1203 19:14:11.925837 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:12 crc kubenswrapper[4731]: I1203 19:14:12.378679 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 19:14:12 crc kubenswrapper[4731]: I1203 19:14:12.378814 4731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:14:12 crc kubenswrapper[4731]: I1203 19:14:12.391845 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 19:14:13 crc kubenswrapper[4731]: I1203 19:14:13.988943 4731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:14:13 crc 
kubenswrapper[4731]: I1203 19:14:13.989362 4731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:14:14 crc kubenswrapper[4731]: I1203 19:14:14.846931 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:14 crc kubenswrapper[4731]: I1203 19:14:14.998056 4731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 19:14:15 crc kubenswrapper[4731]: I1203 19:14:15.000353 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.043665 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerStarted","Data":"8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86"} Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.044569 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="ceilometer-central-agent" containerID="cri-o://64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7" gracePeriod=30 Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.044617 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.044734 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="ceilometer-notification-agent" containerID="cri-o://0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6" gracePeriod=30 Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.044720 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="sg-core" containerID="cri-o://6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642" gracePeriod=30 Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.044795 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="proxy-httpd" containerID="cri-o://8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86" gracePeriod=30 Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.050053 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" event={"ID":"996da76c-8786-41f9-aa84-4722e2755f0e","Type":"ContainerStarted","Data":"524660bf6f72d8fcc0b4c01e70ecca6b752e42fcdbf72e4b972d4019db523655"} Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.084145 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.685701143 podStartE2EDuration="15.084121059s" podCreationTimestamp="2025-12-03 19:14:03 +0000 UTC" firstStartedPulling="2025-12-03 19:14:04.89927727 +0000 UTC m=+1165.497871734" lastFinishedPulling="2025-12-03 19:14:17.297697186 +0000 UTC m=+1177.896291650" observedRunningTime="2025-12-03 19:14:18.075125127 +0000 UTC m=+1178.673719611" watchObservedRunningTime="2025-12-03 19:14:18.084121059 +0000 UTC m=+1178.682715523" Dec 03 19:14:18 crc kubenswrapper[4731]: I1203 19:14:18.112864 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" podStartSLOduration=2.458252681 podStartE2EDuration="16.112827218s" podCreationTimestamp="2025-12-03 19:14:02 +0000 UTC" firstStartedPulling="2025-12-03 19:14:03.645128132 +0000 UTC m=+1164.243722596" lastFinishedPulling="2025-12-03 19:14:17.299702669 +0000 UTC m=+1177.898297133" observedRunningTime="2025-12-03 19:14:18.101155123 +0000 UTC m=+1178.699749587" 
watchObservedRunningTime="2025-12-03 19:14:18.112827218 +0000 UTC m=+1178.711421692" Dec 03 19:14:19 crc kubenswrapper[4731]: I1203 19:14:19.074184 4731 generic.go:334] "Generic (PLEG): container finished" podID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerID="8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86" exitCode=0 Dec 03 19:14:19 crc kubenswrapper[4731]: I1203 19:14:19.074231 4731 generic.go:334] "Generic (PLEG): container finished" podID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerID="6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642" exitCode=2 Dec 03 19:14:19 crc kubenswrapper[4731]: I1203 19:14:19.074242 4731 generic.go:334] "Generic (PLEG): container finished" podID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerID="0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6" exitCode=0 Dec 03 19:14:19 crc kubenswrapper[4731]: I1203 19:14:19.074711 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerDied","Data":"8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86"} Dec 03 19:14:19 crc kubenswrapper[4731]: I1203 19:14:19.074763 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerDied","Data":"6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642"} Dec 03 19:14:19 crc kubenswrapper[4731]: I1203 19:14:19.074777 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerDied","Data":"0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6"} Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.903358 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.929810 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j4pd\" (UniqueName: \"kubernetes.io/projected/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-kube-api-access-5j4pd\") pod \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.929911 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-sg-core-conf-yaml\") pod \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.929969 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-scripts\") pod \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.930063 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-combined-ca-bundle\") pod \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.930101 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-config-data\") pod \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.930225 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-ceilometer-tls-certs\") pod \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.930282 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-run-httpd\") pod \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.930347 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-log-httpd\") pod \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\" (UID: \"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e\") " Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.933268 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" (UID: "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.934980 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" (UID: "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.964594 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-scripts" (OuterVolumeSpecName: "scripts") pod "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" (UID: "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:21 crc kubenswrapper[4731]: I1203 19:14:21.964665 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-kube-api-access-5j4pd" (OuterVolumeSpecName: "kube-api-access-5j4pd") pod "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" (UID: "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e"). InnerVolumeSpecName "kube-api-access-5j4pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.011474 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" (UID: "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.013453 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" (UID: "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.033825 4731 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.034155 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.034232 4731 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.034308 4731 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.034378 4731 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.034433 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j4pd\" (UniqueName: \"kubernetes.io/projected/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-kube-api-access-5j4pd\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.056611 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" (UID: 
"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.097420 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-config-data" (OuterVolumeSpecName: "config-data") pod "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" (UID: "1d31c3cc-6eee-4c88-99ef-3ea0dc86766e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.103556 4731 generic.go:334] "Generic (PLEG): container finished" podID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerID="64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7" exitCode=0 Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.103614 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerDied","Data":"64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7"} Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.103679 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d31c3cc-6eee-4c88-99ef-3ea0dc86766e","Type":"ContainerDied","Data":"83b39c9abc9d0f7402da9e44dcc48f9063f489eb5229fa8b96a60639655bf2f7"} Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.103699 4731 scope.go:117] "RemoveContainer" containerID="8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.103704 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.124905 4731 scope.go:117] "RemoveContainer" containerID="6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.137166 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.137306 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.146703 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.160002 4731 scope.go:117] "RemoveContainer" containerID="0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.160970 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.178093 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.178679 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="ceilometer-notification-agent" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.178701 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="ceilometer-notification-agent" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.178726 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" 
containerName="horizon-log" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.178734 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon-log" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.178752 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="proxy-httpd" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.178761 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="proxy-httpd" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.178784 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="sg-core" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.178792 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="sg-core" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.178810 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="ceilometer-central-agent" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.178818 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="ceilometer-central-agent" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.178840 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.178847 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.179080 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon-log" Dec 03 19:14:22 
crc kubenswrapper[4731]: I1203 19:14:22.179101 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="ceilometer-notification-agent" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.179120 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="ceilometer-central-agent" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.179137 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="f69c7907-c04f-4b84-9e31-59fca146a62d" containerName="horizon" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.179148 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="proxy-httpd" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.179161 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" containerName="sg-core" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.181644 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.185828 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.186099 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.186446 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.191012 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.195125 4731 scope.go:117] "RemoveContainer" containerID="64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.231484 4731 scope.go:117] "RemoveContainer" containerID="8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.232595 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86\": container with ID starting with 8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86 not found: ID does not exist" containerID="8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.232650 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86"} err="failed to get container status \"8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86\": rpc error: code = NotFound desc = could not find container \"8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86\": 
container with ID starting with 8b20c8695ce976c15793be964977b6886ae192823064c670ae86a3a0adee7f86 not found: ID does not exist" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.232687 4731 scope.go:117] "RemoveContainer" containerID="6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.233146 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642\": container with ID starting with 6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642 not found: ID does not exist" containerID="6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.233193 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642"} err="failed to get container status \"6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642\": rpc error: code = NotFound desc = could not find container \"6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642\": container with ID starting with 6b8a31ed6211b1c8e73dbb4d8c5a5677514b40392767be8914c3681dd9b15642 not found: ID does not exist" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.233226 4731 scope.go:117] "RemoveContainer" containerID="0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.233723 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6\": container with ID starting with 0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6 not found: ID does not exist" 
containerID="0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.233753 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6"} err="failed to get container status \"0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6\": rpc error: code = NotFound desc = could not find container \"0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6\": container with ID starting with 0cec91a17d3beb7da2d61f5ce191d5132d90d3f0703dcfd00c80a6db56aa2ef6 not found: ID does not exist" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.233768 4731 scope.go:117] "RemoveContainer" containerID="64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7" Dec 03 19:14:22 crc kubenswrapper[4731]: E1203 19:14:22.234127 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7\": container with ID starting with 64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7 not found: ID does not exist" containerID="64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.234149 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7"} err="failed to get container status \"64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7\": rpc error: code = NotFound desc = could not find container \"64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7\": container with ID starting with 64dba8f4b957fda48885ddef364e809ce411baddf46614c31424f2057c51d6d7 not found: ID does not exist" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.341607 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.341693 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-config-data\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.341718 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.341881 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-scripts\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.342076 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-run-httpd\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.342241 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.342474 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-log-httpd\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.342597 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lgn4\" (UniqueName: \"kubernetes.io/projected/844b08b1-ebeb-413b-819c-565bf2b0fd30-kube-api-access-6lgn4\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.444360 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.444474 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-config-data\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.444497 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " 
pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.444524 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-scripts\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.444574 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-run-httpd\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.444616 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.444668 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-log-httpd\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.444710 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lgn4\" (UniqueName: \"kubernetes.io/projected/844b08b1-ebeb-413b-819c-565bf2b0fd30-kube-api-access-6lgn4\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.446505 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-run-httpd\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.446941 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-log-httpd\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.450918 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.450929 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-scripts\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.450930 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.451467 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.451700 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-config-data\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.467911 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lgn4\" (UniqueName: \"kubernetes.io/projected/844b08b1-ebeb-413b-819c-565bf2b0fd30-kube-api-access-6lgn4\") pod \"ceilometer-0\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") " pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.507819 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:22 crc kubenswrapper[4731]: I1203 19:14:22.962219 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:23 crc kubenswrapper[4731]: I1203 19:14:23.113324 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerStarted","Data":"b38c276bf4c0dffdd43ef2bda2571602b2af75eb66b255f1d5187bab2ad831f0"} Dec 03 19:14:23 crc kubenswrapper[4731]: I1203 19:14:23.870692 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d31c3cc-6eee-4c88-99ef-3ea0dc86766e" path="/var/lib/kubelet/pods/1d31c3cc-6eee-4c88-99ef-3ea0dc86766e/volumes" Dec 03 19:14:24 crc kubenswrapper[4731]: I1203 19:14:24.129891 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerStarted","Data":"2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a"} Dec 03 19:14:25 crc kubenswrapper[4731]: I1203 19:14:25.142324 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerStarted","Data":"a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68"} Dec 03 19:14:25 crc kubenswrapper[4731]: I1203 19:14:25.143845 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerStarted","Data":"77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99"} Dec 03 19:14:27 crc kubenswrapper[4731]: I1203 19:14:27.162363 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerStarted","Data":"6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd"} Dec 03 19:14:27 crc kubenswrapper[4731]: I1203 19:14:27.164415 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 19:14:27 crc kubenswrapper[4731]: I1203 19:14:27.189613 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.5662019969999998 podStartE2EDuration="5.189593953s" podCreationTimestamp="2025-12-03 19:14:22 +0000 UTC" firstStartedPulling="2025-12-03 19:14:22.967900389 +0000 UTC m=+1183.566494853" lastFinishedPulling="2025-12-03 19:14:26.591292345 +0000 UTC m=+1187.189886809" observedRunningTime="2025-12-03 19:14:27.187509938 +0000 UTC m=+1187.786104402" watchObservedRunningTime="2025-12-03 19:14:27.189593953 +0000 UTC m=+1187.788188417" Dec 03 19:14:28 crc kubenswrapper[4731]: I1203 19:14:28.172342 4731 generic.go:334] "Generic (PLEG): container finished" podID="996da76c-8786-41f9-aa84-4722e2755f0e" containerID="524660bf6f72d8fcc0b4c01e70ecca6b752e42fcdbf72e4b972d4019db523655" exitCode=0 Dec 03 19:14:28 crc kubenswrapper[4731]: I1203 19:14:28.172426 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" 
event={"ID":"996da76c-8786-41f9-aa84-4722e2755f0e","Type":"ContainerDied","Data":"524660bf6f72d8fcc0b4c01e70ecca6b752e42fcdbf72e4b972d4019db523655"} Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.588430 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.698577 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-combined-ca-bundle\") pod \"996da76c-8786-41f9-aa84-4722e2755f0e\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.698856 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-scripts\") pod \"996da76c-8786-41f9-aa84-4722e2755f0e\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.698892 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrzlt\" (UniqueName: \"kubernetes.io/projected/996da76c-8786-41f9-aa84-4722e2755f0e-kube-api-access-rrzlt\") pod \"996da76c-8786-41f9-aa84-4722e2755f0e\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.698940 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-config-data\") pod \"996da76c-8786-41f9-aa84-4722e2755f0e\" (UID: \"996da76c-8786-41f9-aa84-4722e2755f0e\") " Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.707148 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996da76c-8786-41f9-aa84-4722e2755f0e-kube-api-access-rrzlt" 
(OuterVolumeSpecName: "kube-api-access-rrzlt") pod "996da76c-8786-41f9-aa84-4722e2755f0e" (UID: "996da76c-8786-41f9-aa84-4722e2755f0e"). InnerVolumeSpecName "kube-api-access-rrzlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.707347 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-scripts" (OuterVolumeSpecName: "scripts") pod "996da76c-8786-41f9-aa84-4722e2755f0e" (UID: "996da76c-8786-41f9-aa84-4722e2755f0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.744467 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-config-data" (OuterVolumeSpecName: "config-data") pod "996da76c-8786-41f9-aa84-4722e2755f0e" (UID: "996da76c-8786-41f9-aa84-4722e2755f0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.748798 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "996da76c-8786-41f9-aa84-4722e2755f0e" (UID: "996da76c-8786-41f9-aa84-4722e2755f0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.801468 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.801516 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrzlt\" (UniqueName: \"kubernetes.io/projected/996da76c-8786-41f9-aa84-4722e2755f0e-kube-api-access-rrzlt\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.801530 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:29 crc kubenswrapper[4731]: I1203 19:14:29.801539 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/996da76c-8786-41f9-aa84-4722e2755f0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.214862 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" event={"ID":"996da76c-8786-41f9-aa84-4722e2755f0e","Type":"ContainerDied","Data":"c07212019f81426a155d4f2610266478ee6da00c28d9a195344202910c5a72e4"} Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.215466 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c07212019f81426a155d4f2610266478ee6da00c28d9a195344202910c5a72e4" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.215062 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nrxcp" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.295039 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 19:14:30 crc kubenswrapper[4731]: E1203 19:14:30.295866 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996da76c-8786-41f9-aa84-4722e2755f0e" containerName="nova-cell0-conductor-db-sync" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.295892 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="996da76c-8786-41f9-aa84-4722e2755f0e" containerName="nova-cell0-conductor-db-sync" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.296298 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="996da76c-8786-41f9-aa84-4722e2755f0e" containerName="nova-cell0-conductor-db-sync" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.297349 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.300239 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rsgdq" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.300526 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.319476 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.413926 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: 
I1203 19:14:30.414088 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcx8x\" (UniqueName: \"kubernetes.io/projected/70a05d06-8ad5-4121-97e4-36f9309afd59-kube-api-access-zcx8x\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.414121 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.448521 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 19:14:30 crc kubenswrapper[4731]: E1203 19:14:30.449482 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-zcx8x], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell0-conductor-0" podUID="70a05d06-8ad5-4121-97e4-36f9309afd59" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.515388 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcx8x\" (UniqueName: \"kubernetes.io/projected/70a05d06-8ad5-4121-97e4-36f9309afd59-kube-api-access-zcx8x\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.515440 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " 
pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.515549 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.521173 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.522098 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:30 crc kubenswrapper[4731]: I1203 19:14:30.539240 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcx8x\" (UniqueName: \"kubernetes.io/projected/70a05d06-8ad5-4121-97e4-36f9309afd59-kube-api-access-zcx8x\") pod \"nova-cell0-conductor-0\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") " pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.223870 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.239876 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.333119 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcx8x\" (UniqueName: \"kubernetes.io/projected/70a05d06-8ad5-4121-97e4-36f9309afd59-kube-api-access-zcx8x\") pod \"70a05d06-8ad5-4121-97e4-36f9309afd59\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") "
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.333233 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-combined-ca-bundle\") pod \"70a05d06-8ad5-4121-97e4-36f9309afd59\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") "
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.333392 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-config-data\") pod \"70a05d06-8ad5-4121-97e4-36f9309afd59\" (UID: \"70a05d06-8ad5-4121-97e4-36f9309afd59\") "
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.339318 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-config-data" (OuterVolumeSpecName: "config-data") pod "70a05d06-8ad5-4121-97e4-36f9309afd59" (UID: "70a05d06-8ad5-4121-97e4-36f9309afd59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.339942 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a05d06-8ad5-4121-97e4-36f9309afd59-kube-api-access-zcx8x" (OuterVolumeSpecName: "kube-api-access-zcx8x") pod "70a05d06-8ad5-4121-97e4-36f9309afd59" (UID: "70a05d06-8ad5-4121-97e4-36f9309afd59"). InnerVolumeSpecName "kube-api-access-zcx8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.341362 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70a05d06-8ad5-4121-97e4-36f9309afd59" (UID: "70a05d06-8ad5-4121-97e4-36f9309afd59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.436779 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcx8x\" (UniqueName: \"kubernetes.io/projected/70a05d06-8ad5-4121-97e4-36f9309afd59-kube-api-access-zcx8x\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.437127 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:31 crc kubenswrapper[4731]: I1203 19:14:31.437247 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a05d06-8ad5-4121-97e4-36f9309afd59-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.233270 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.298803 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.322400 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.333402 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.336551 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="ceilometer-central-agent" containerID="cri-o://2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a" gracePeriod=30
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.337143 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="proxy-httpd" containerID="cri-o://6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd" gracePeriod=30
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.337214 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="sg-core" containerID="cri-o://a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68" gracePeriod=30
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.337273 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="ceilometer-notification-agent" containerID="cri-o://77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99" gracePeriod=30
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.346413 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.347909 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.350081 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rsgdq"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.350285 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.388526 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.478645 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd9sc\" (UniqueName: \"kubernetes.io/projected/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-kube-api-access-hd9sc\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.479345 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.479463 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.581739 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd9sc\" (UniqueName: \"kubernetes.io/projected/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-kube-api-access-hd9sc\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.582043 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.582154 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.588016 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.594947 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.599987 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd9sc\" (UniqueName: \"kubernetes.io/projected/0080e314-2b1a-4194-b82c-84d5b3d4f1a8-kube-api-access-hd9sc\") pod \"nova-cell0-conductor-0\" (UID: \"0080e314-2b1a-4194-b82c-84d5b3d4f1a8\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:32 crc kubenswrapper[4731]: I1203 19:14:32.676137 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.153023 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.253430 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0080e314-2b1a-4194-b82c-84d5b3d4f1a8","Type":"ContainerStarted","Data":"668ba251e3c443dc633e0e8427b999638fc6a0506a91968818c289d499578a14"}
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.257521 4731 generic.go:334] "Generic (PLEG): container finished" podID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerID="6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd" exitCode=0
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.257561 4731 generic.go:334] "Generic (PLEG): container finished" podID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerID="a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68" exitCode=2
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.257571 4731 generic.go:334] "Generic (PLEG): container finished" podID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerID="77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99" exitCode=0
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.257598 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerDied","Data":"6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd"}
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.257653 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerDied","Data":"a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68"}
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.257668 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerDied","Data":"77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99"}
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.858477 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 19:14:33 crc kubenswrapper[4731]: I1203 19:14:33.867173 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a05d06-8ad5-4121-97e4-36f9309afd59" path="/var/lib/kubelet/pods/70a05d06-8ad5-4121-97e4-36f9309afd59/volumes"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.009812 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-config-data\") pod \"844b08b1-ebeb-413b-819c-565bf2b0fd30\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") "
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010240 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-scripts\") pod \"844b08b1-ebeb-413b-819c-565bf2b0fd30\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") "
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010328 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-run-httpd\") pod \"844b08b1-ebeb-413b-819c-565bf2b0fd30\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") "
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010377 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lgn4\" (UniqueName: \"kubernetes.io/projected/844b08b1-ebeb-413b-819c-565bf2b0fd30-kube-api-access-6lgn4\") pod \"844b08b1-ebeb-413b-819c-565bf2b0fd30\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") "
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010441 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-combined-ca-bundle\") pod \"844b08b1-ebeb-413b-819c-565bf2b0fd30\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") "
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010468 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-sg-core-conf-yaml\") pod \"844b08b1-ebeb-413b-819c-565bf2b0fd30\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") "
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010483 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-ceilometer-tls-certs\") pod \"844b08b1-ebeb-413b-819c-565bf2b0fd30\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") "
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010522 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-log-httpd\") pod \"844b08b1-ebeb-413b-819c-565bf2b0fd30\" (UID: \"844b08b1-ebeb-413b-819c-565bf2b0fd30\") "
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010897 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "844b08b1-ebeb-413b-819c-565bf2b0fd30" (UID: "844b08b1-ebeb-413b-819c-565bf2b0fd30"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.010996 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "844b08b1-ebeb-413b-819c-565bf2b0fd30" (UID: "844b08b1-ebeb-413b-819c-565bf2b0fd30"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.016226 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/844b08b1-ebeb-413b-819c-565bf2b0fd30-kube-api-access-6lgn4" (OuterVolumeSpecName: "kube-api-access-6lgn4") pod "844b08b1-ebeb-413b-819c-565bf2b0fd30" (UID: "844b08b1-ebeb-413b-819c-565bf2b0fd30"). InnerVolumeSpecName "kube-api-access-6lgn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.019120 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-scripts" (OuterVolumeSpecName: "scripts") pod "844b08b1-ebeb-413b-819c-565bf2b0fd30" (UID: "844b08b1-ebeb-413b-819c-565bf2b0fd30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.049034 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "844b08b1-ebeb-413b-819c-565bf2b0fd30" (UID: "844b08b1-ebeb-413b-819c-565bf2b0fd30"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.064170 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "844b08b1-ebeb-413b-819c-565bf2b0fd30" (UID: "844b08b1-ebeb-413b-819c-565bf2b0fd30"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.095037 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "844b08b1-ebeb-413b-819c-565bf2b0fd30" (UID: "844b08b1-ebeb-413b-819c-565bf2b0fd30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.106037 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-config-data" (OuterVolumeSpecName: "config-data") pod "844b08b1-ebeb-413b-819c-565bf2b0fd30" (UID: "844b08b1-ebeb-413b-819c-565bf2b0fd30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.112489 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.112547 4731 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.112559 4731 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.112570 4731 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.112579 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.112588 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/844b08b1-ebeb-413b-819c-565bf2b0fd30-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.112597 4731 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844b08b1-ebeb-413b-819c-565bf2b0fd30-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.112606 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lgn4\" (UniqueName: \"kubernetes.io/projected/844b08b1-ebeb-413b-819c-565bf2b0fd30-kube-api-access-6lgn4\") on node \"crc\" DevicePath \"\""
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.271534 4731 generic.go:334] "Generic (PLEG): container finished" podID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerID="2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a" exitCode=0
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.271596 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.271604 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerDied","Data":"2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a"}
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.271634 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"844b08b1-ebeb-413b-819c-565bf2b0fd30","Type":"ContainerDied","Data":"b38c276bf4c0dffdd43ef2bda2571602b2af75eb66b255f1d5187bab2ad831f0"}
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.271652 4731 scope.go:117] "RemoveContainer" containerID="6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.274125 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0080e314-2b1a-4194-b82c-84d5b3d4f1a8","Type":"ContainerStarted","Data":"48b4656a0e2bea7e2938b16110b25e99c661bfb9714da908b482d158686dc2d2"}
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.274701 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.299582 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.299551158 podStartE2EDuration="2.299551158s" podCreationTimestamp="2025-12-03 19:14:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:14:34.290215856 +0000 UTC m=+1194.888810330" watchObservedRunningTime="2025-12-03 19:14:34.299551158 +0000 UTC m=+1194.898145622"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.312429 4731 scope.go:117] "RemoveContainer" containerID="a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.328042 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.335387 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.346406 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 03 19:14:34 crc kubenswrapper[4731]: E1203 19:14:34.346904 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="proxy-httpd"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.346923 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="proxy-httpd"
Dec 03 19:14:34 crc kubenswrapper[4731]: E1203 19:14:34.346942 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="ceilometer-central-agent"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.346949 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="ceilometer-central-agent"
Dec 03 19:14:34 crc kubenswrapper[4731]: E1203 19:14:34.346966 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="sg-core"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.346973 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="sg-core"
Dec 03 19:14:34 crc kubenswrapper[4731]: E1203 19:14:34.347007 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="ceilometer-notification-agent"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.347014 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="ceilometer-notification-agent"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.347194 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="ceilometer-notification-agent"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.347214 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="proxy-httpd"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.347225 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="sg-core"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.347235 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" containerName="ceilometer-central-agent"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.349426 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.351832 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.352376 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.353111 4731 scope.go:117] "RemoveContainer" containerID="77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.353427 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.358840 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.393058 4731 scope.go:117] "RemoveContainer" containerID="2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.414365 4731 scope.go:117] "RemoveContainer" containerID="6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd"
Dec 03 19:14:34 crc kubenswrapper[4731]: E1203 19:14:34.414894 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd\": container with ID starting with 6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd not found: ID does not exist" containerID="6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.414962 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd"} err="failed to get container status \"6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd\": rpc error: code = NotFound desc = could not find container \"6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd\": container with ID starting with 6c4843e9e616eec5cb8a9b48d111c23ec459546724ba06a6b6344208be3f19bd not found: ID does not exist"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.415014 4731 scope.go:117] "RemoveContainer" containerID="a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68"
Dec 03 19:14:34 crc kubenswrapper[4731]: E1203 19:14:34.415402 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68\": container with ID starting with a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68 not found: ID does not exist" containerID="a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.415439 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68"} err="failed to get container status \"a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68\": rpc error: code = NotFound desc = could not find container \"a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68\": container with ID starting with a64066c373bb615cfa6c440784228fdaf0e3622c67c11cf2c7efbfdffe045a68 not found: ID does not exist"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.415466 4731 scope.go:117] "RemoveContainer" containerID="77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99"
Dec 03 19:14:34 crc kubenswrapper[4731]: E1203 19:14:34.416445 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99\": container with ID starting with 77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99 not found: ID does not exist" containerID="77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.416471 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99"} err="failed to get container status \"77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99\": rpc error: code = NotFound desc = could not find container \"77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99\": container with ID starting with 77646fa931ffaa455a8829d7d19e7c9f1c553a0c4e6b9862cbddebc5dcf3cc99 not found: ID does not exist"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.416487 4731 scope.go:117] "RemoveContainer" containerID="2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a"
Dec 03 19:14:34 crc kubenswrapper[4731]: E1203 19:14:34.416723 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a\": container with ID starting with 2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a not found: ID does not exist" containerID="2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.416753 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a"} err="failed to get container status \"2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a\": rpc error: code = NotFound desc = could not find container \"2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a\": container with ID starting with 2fe46d832fdc8486ddcb0ed7b01a3e2971cd5b674e7b689eb9b5b13100bf7e8a not found: ID does not exist"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.417174 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-log-httpd\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.417507 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-run-httpd\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.417549 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.417595 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.417632 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-scripts\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.417711 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.417791 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-config-data\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.417962 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f49jw\" (UniqueName: \"kubernetes.io/projected/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-kube-api-access-f49jw\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.519897 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f49jw\" (UniqueName: \"kubernetes.io/projected/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-kube-api-access-f49jw\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.520030 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-log-httpd\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.520280 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-run-httpd\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.520338 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.520404 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.520454 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-scripts\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.520541 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.520606 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-config-data\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.520616 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-log-httpd\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.521678 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-run-httpd\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.525236 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.526235 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.526277 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0"
Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.526685 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-scripts\") pod \"ceilometer-0\" (UID: 
\"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0" Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.527880 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-config-data\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0" Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.541227 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f49jw\" (UniqueName: \"kubernetes.io/projected/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-kube-api-access-f49jw\") pod \"ceilometer-0\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " pod="openstack/ceilometer-0" Dec 03 19:14:34 crc kubenswrapper[4731]: I1203 19:14:34.697550 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:14:35 crc kubenswrapper[4731]: I1203 19:14:35.188235 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:14:35 crc kubenswrapper[4731]: I1203 19:14:35.287007 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerStarted","Data":"7d2eb7f22d213a21f5080fb056c75f653081f08ddd3708447469138e56620ea6"} Dec 03 19:14:35 crc kubenswrapper[4731]: I1203 19:14:35.895784 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="844b08b1-ebeb-413b-819c-565bf2b0fd30" path="/var/lib/kubelet/pods/844b08b1-ebeb-413b-819c-565bf2b0fd30/volumes" Dec 03 19:14:36 crc kubenswrapper[4731]: I1203 19:14:36.306650 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerStarted","Data":"45621dbe8ef595d6d18625ea022f2bb8ac1d6970a725152259d0be783ce712e6"} Dec 03 19:14:37 crc kubenswrapper[4731]: I1203 
19:14:37.323724 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerStarted","Data":"2687bbaf72399581f9154a0cf2f2ef71f6f5de7bc3d48508e68558b27a00c9bd"} Dec 03 19:14:37 crc kubenswrapper[4731]: I1203 19:14:37.324564 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerStarted","Data":"b13891c28077988b9b36e217a84ae66e9fd568464c62b4dcf4d5217a02b52ada"} Dec 03 19:14:39 crc kubenswrapper[4731]: I1203 19:14:39.350665 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerStarted","Data":"c5233383df8d91e54cf1617280fa040fb23c8050974bf379848173c4796f6fd0"} Dec 03 19:14:39 crc kubenswrapper[4731]: I1203 19:14:39.351543 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 19:14:39 crc kubenswrapper[4731]: I1203 19:14:39.401778 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.935035126 podStartE2EDuration="5.401743773s" podCreationTimestamp="2025-12-03 19:14:34 +0000 UTC" firstStartedPulling="2025-12-03 19:14:35.19668129 +0000 UTC m=+1195.795275774" lastFinishedPulling="2025-12-03 19:14:38.663389957 +0000 UTC m=+1199.261984421" observedRunningTime="2025-12-03 19:14:39.376292885 +0000 UTC m=+1199.974887369" watchObservedRunningTime="2025-12-03 19:14:39.401743773 +0000 UTC m=+1200.000338277" Dec 03 19:14:42 crc kubenswrapper[4731]: I1203 19:14:42.711067 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.450495 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5fcx2"] Dec 03 19:14:43 crc kubenswrapper[4731]: 
I1203 19:14:43.452590 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.455599 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.455675 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.471299 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5fcx2"] Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.559351 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-config-data\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.559473 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-scripts\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.559517 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgx86\" (UniqueName: \"kubernetes.io/projected/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-kube-api-access-tgx86\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.559575 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.662267 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.662642 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-config-data\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.662774 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-scripts\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.662874 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgx86\" (UniqueName: \"kubernetes.io/projected/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-kube-api-access-tgx86\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.671121 4731 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-config-data\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.671270 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.690033 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-scripts\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.699425 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.704448 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgx86\" (UniqueName: \"kubernetes.io/projected/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-kube-api-access-tgx86\") pod \"nova-cell0-cell-mapping-5fcx2\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.707051 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.713021 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.716775 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.776516 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.866819 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzf4z\" (UniqueName: \"kubernetes.io/projected/14da4940-c4b9-41e9-bd05-d84ef5d63c03-kube-api-access-fzf4z\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.866898 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.866929 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-config-data\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.866983 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14da4940-c4b9-41e9-bd05-d84ef5d63c03-logs\") pod \"nova-metadata-0\" 
(UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.918600 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.919778 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.919882 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.938275 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.952412 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.961749 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.965583 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.969100 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzf4z\" (UniqueName: \"kubernetes.io/projected/14da4940-c4b9-41e9-bd05-d84ef5d63c03-kube-api-access-fzf4z\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.969204 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 
19:14:43.969288 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-config-data\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.969384 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14da4940-c4b9-41e9-bd05-d84ef5d63c03-logs\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.969985 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14da4940-c4b9-41e9-bd05-d84ef5d63c03-logs\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.977343 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.978994 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.980390 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:43 crc kubenswrapper[4731]: I1203 19:14:43.987761 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.001437 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.021346 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-config-data\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.022175 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzf4z\" (UniqueName: \"kubernetes.io/projected/14da4940-c4b9-41e9-bd05-d84ef5d63c03-kube-api-access-fzf4z\") pod \"nova-metadata-0\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " pod="openstack/nova-metadata-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.064903 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093114 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-logs\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc 
kubenswrapper[4731]: I1203 19:14:44.093236 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6wkm\" (UniqueName: \"kubernetes.io/projected/9bef1af4-84af-4242-b22e-5673b6fc209a-kube-api-access-q6wkm\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093312 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-config-data\") pod \"nova-scheduler-0\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093350 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093404 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093442 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-config-data\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093519 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093541 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093577 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8h9z\" (UniqueName: \"kubernetes.io/projected/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-kube-api-access-p8h9z\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.093597 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4t8l\" (UniqueName: \"kubernetes.io/projected/dde31fb3-591b-4f8c-af33-df37bb71c3a6-kube-api-access-z4t8l\") pod \"nova-scheduler-0\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.197677 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198223 4731 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198291 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-config-data\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198339 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198357 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198391 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8h9z\" (UniqueName: \"kubernetes.io/projected/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-kube-api-access-p8h9z\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198414 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4t8l\" (UniqueName: \"kubernetes.io/projected/dde31fb3-591b-4f8c-af33-df37bb71c3a6-kube-api-access-z4t8l\") pod \"nova-scheduler-0\" (UID: 
\"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198480 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-logs\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198534 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6wkm\" (UniqueName: \"kubernetes.io/projected/9bef1af4-84af-4242-b22e-5673b6fc209a-kube-api-access-q6wkm\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.198564 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-config-data\") pod \"nova-scheduler-0\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.200000 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-logs\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.202039 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.204046 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.205623 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-config-data\") pod \"nova-scheduler-0\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.206931 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.210830 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.215874 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.237538 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-config-data\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.242882 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8h9z\" (UniqueName: \"kubernetes.io/projected/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-kube-api-access-p8h9z\") pod \"nova-api-0\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.250222 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4t8l\" (UniqueName: \"kubernetes.io/projected/dde31fb3-591b-4f8c-af33-df37bb71c3a6-kube-api-access-z4t8l\") pod \"nova-scheduler-0\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.261561 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6wkm\" (UniqueName: \"kubernetes.io/projected/9bef1af4-84af-4242-b22e-5673b6fc209a-kube-api-access-q6wkm\") pod \"nova-cell1-novncproxy-0\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.290209 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.402225 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.408283 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.571238 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5fcx2"] Dec 03 19:14:44 crc kubenswrapper[4731]: W1203 19:14:44.574797 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b0a2f42_1aee_41ee_87ce_ddd561f3c7a0.slice/crio-8ed36f91c09b4ba37b5aa6189aba94fe2aa646912189a29d1e6c056a1886c956 WatchSource:0}: Error finding container 8ed36f91c09b4ba37b5aa6189aba94fe2aa646912189a29d1e6c056a1886c956: Status 404 returned error can't find the container with id 8ed36f91c09b4ba37b5aa6189aba94fe2aa646912189a29d1e6c056a1886c956 Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.840437 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rbmpj"] Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.842144 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.844664 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.845062 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.900489 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rbmpj"] Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.922855 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:14:44 crc kubenswrapper[4731]: I1203 19:14:44.925208 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.054986 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.055073 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-scripts\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.055097 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-config-data\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.055162 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgh9p\" (UniqueName: \"kubernetes.io/projected/676bb34b-7c07-46fb-bf1b-62e21e5293f8-kube-api-access-sgh9p\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.123659 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.156980 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgh9p\" (UniqueName: \"kubernetes.io/projected/676bb34b-7c07-46fb-bf1b-62e21e5293f8-kube-api-access-sgh9p\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.158114 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.158215 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-scripts\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " 
pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.158269 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-config-data\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.172068 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-config-data\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.179450 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-scripts\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.181508 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.192151 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgh9p\" (UniqueName: \"kubernetes.io/projected/676bb34b-7c07-46fb-bf1b-62e21e5293f8-kube-api-access-sgh9p\") pod \"nova-cell1-conductor-db-sync-rbmpj\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " 
pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.193802 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.241833 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.303219 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.466298 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7","Type":"ContainerStarted","Data":"2193ad7b91090e49f296851a0a61a9e88dd05f3cbfab658add4fa357efb518ae"} Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.472795 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9bef1af4-84af-4242-b22e-5673b6fc209a","Type":"ContainerStarted","Data":"a31f250ceec446c2b72ec64444ac94415e795fdca8871e1b07a24f73c96a91f6"} Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.474840 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14da4940-c4b9-41e9-bd05-d84ef5d63c03","Type":"ContainerStarted","Data":"d6cb23a8bbddf38275c3e1f0d9efdad509d6e32171c5bd08080e85d1ea48838f"} Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.499002 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dde31fb3-591b-4f8c-af33-df37bb71c3a6","Type":"ContainerStarted","Data":"623063e970510c484070e045b878d0728871ee2b436c4b2cf8229891caea7581"} Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.506547 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5fcx2" 
event={"ID":"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0","Type":"ContainerStarted","Data":"76137f6787685f877f7650d2c7f7cc5b85e27124453be78f4ae6258123f56ba9"} Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.506615 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5fcx2" event={"ID":"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0","Type":"ContainerStarted","Data":"8ed36f91c09b4ba37b5aa6189aba94fe2aa646912189a29d1e6c056a1886c956"} Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.586518 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5fcx2" podStartSLOduration=2.586490558 podStartE2EDuration="2.586490558s" podCreationTimestamp="2025-12-03 19:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:14:45.57699852 +0000 UTC m=+1206.175592984" watchObservedRunningTime="2025-12-03 19:14:45.586490558 +0000 UTC m=+1206.185085022" Dec 03 19:14:45 crc kubenswrapper[4731]: I1203 19:14:45.690170 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rbmpj"] Dec 03 19:14:45 crc kubenswrapper[4731]: W1203 19:14:45.708894 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676bb34b_7c07_46fb_bf1b_62e21e5293f8.slice/crio-3702084fc92dd13af4255a2ac5d56816ab2c77c2d6ad2852308721c769956b70 WatchSource:0}: Error finding container 3702084fc92dd13af4255a2ac5d56816ab2c77c2d6ad2852308721c769956b70: Status 404 returned error can't find the container with id 3702084fc92dd13af4255a2ac5d56816ab2c77c2d6ad2852308721c769956b70 Dec 03 19:14:46 crc kubenswrapper[4731]: I1203 19:14:46.521335 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" 
event={"ID":"676bb34b-7c07-46fb-bf1b-62e21e5293f8","Type":"ContainerStarted","Data":"67998100f37b7a848af7004d44074badb97c0e3bd8c716933a0f6321765c73ef"} Dec 03 19:14:46 crc kubenswrapper[4731]: I1203 19:14:46.521808 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" event={"ID":"676bb34b-7c07-46fb-bf1b-62e21e5293f8","Type":"ContainerStarted","Data":"3702084fc92dd13af4255a2ac5d56816ab2c77c2d6ad2852308721c769956b70"} Dec 03 19:14:47 crc kubenswrapper[4731]: I1203 19:14:47.648093 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" podStartSLOduration=3.648062535 podStartE2EDuration="3.648062535s" podCreationTimestamp="2025-12-03 19:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:14:46.546595612 +0000 UTC m=+1207.145190076" watchObservedRunningTime="2025-12-03 19:14:47.648062535 +0000 UTC m=+1208.246656999" Dec 03 19:14:47 crc kubenswrapper[4731]: I1203 19:14:47.648811 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:47 crc kubenswrapper[4731]: I1203 19:14:47.668453 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:14:49 crc kubenswrapper[4731]: I1203 19:14:49.558928 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7","Type":"ContainerStarted","Data":"5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe"} Dec 03 19:14:49 crc kubenswrapper[4731]: I1203 19:14:49.562697 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9bef1af4-84af-4242-b22e-5673b6fc209a","Type":"ContainerStarted","Data":"9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8"} Dec 03 19:14:49 crc 
kubenswrapper[4731]: I1203 19:14:49.562906 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9bef1af4-84af-4242-b22e-5673b6fc209a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8" gracePeriod=30 Dec 03 19:14:49 crc kubenswrapper[4731]: I1203 19:14:49.564861 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14da4940-c4b9-41e9-bd05-d84ef5d63c03","Type":"ContainerStarted","Data":"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be"} Dec 03 19:14:49 crc kubenswrapper[4731]: I1203 19:14:49.568470 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dde31fb3-591b-4f8c-af33-df37bb71c3a6","Type":"ContainerStarted","Data":"2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812"} Dec 03 19:14:49 crc kubenswrapper[4731]: I1203 19:14:49.589118 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.904003688 podStartE2EDuration="6.589093166s" podCreationTimestamp="2025-12-03 19:14:43 +0000 UTC" firstStartedPulling="2025-12-03 19:14:45.225440135 +0000 UTC m=+1205.824034599" lastFinishedPulling="2025-12-03 19:14:48.910529613 +0000 UTC m=+1209.509124077" observedRunningTime="2025-12-03 19:14:49.581365134 +0000 UTC m=+1210.179959598" watchObservedRunningTime="2025-12-03 19:14:49.589093166 +0000 UTC m=+1210.187687630" Dec 03 19:14:49 crc kubenswrapper[4731]: I1203 19:14:49.612499 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.818699464 podStartE2EDuration="6.612477759s" podCreationTimestamp="2025-12-03 19:14:43 +0000 UTC" firstStartedPulling="2025-12-03 19:14:45.117503802 +0000 UTC m=+1205.716098256" lastFinishedPulling="2025-12-03 
19:14:48.911282087 +0000 UTC m=+1209.509876551" observedRunningTime="2025-12-03 19:14:49.608196404 +0000 UTC m=+1210.206790888" watchObservedRunningTime="2025-12-03 19:14:49.612477759 +0000 UTC m=+1210.211072233" Dec 03 19:14:50 crc kubenswrapper[4731]: I1203 19:14:50.579355 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7","Type":"ContainerStarted","Data":"d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba"} Dec 03 19:14:50 crc kubenswrapper[4731]: I1203 19:14:50.584011 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14da4940-c4b9-41e9-bd05-d84ef5d63c03","Type":"ContainerStarted","Data":"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1"} Dec 03 19:14:50 crc kubenswrapper[4731]: I1203 19:14:50.584316 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerName="nova-metadata-log" containerID="cri-o://dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be" gracePeriod=30 Dec 03 19:14:50 crc kubenswrapper[4731]: I1203 19:14:50.584412 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerName="nova-metadata-metadata" containerID="cri-o://c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1" gracePeriod=30 Dec 03 19:14:50 crc kubenswrapper[4731]: I1203 19:14:50.621468 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.03287816 podStartE2EDuration="7.621439424s" podCreationTimestamp="2025-12-03 19:14:43 +0000 UTC" firstStartedPulling="2025-12-03 19:14:45.327963507 +0000 UTC m=+1205.926557971" lastFinishedPulling="2025-12-03 19:14:48.916524771 +0000 UTC m=+1209.515119235" observedRunningTime="2025-12-03 
19:14:50.615814317 +0000 UTC m=+1211.214408791" watchObservedRunningTime="2025-12-03 19:14:50.621439424 +0000 UTC m=+1211.220033898" Dec 03 19:14:50 crc kubenswrapper[4731]: I1203 19:14:50.656602 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.663517486 podStartE2EDuration="7.656566005s" podCreationTimestamp="2025-12-03 19:14:43 +0000 UTC" firstStartedPulling="2025-12-03 19:14:44.91924872 +0000 UTC m=+1205.517843184" lastFinishedPulling="2025-12-03 19:14:48.912297239 +0000 UTC m=+1209.510891703" observedRunningTime="2025-12-03 19:14:50.632517631 +0000 UTC m=+1211.231112095" watchObservedRunningTime="2025-12-03 19:14:50.656566005 +0000 UTC m=+1211.255160479" Dec 03 19:14:51 crc kubenswrapper[4731]: E1203 19:14:51.057332 4731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14da4940_c4b9_41e9_bd05_d84ef5d63c03.slice/crio-conmon-c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1.scope\": RecentStats: unable to find data in memory cache]" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.235619 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.299873 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzf4z\" (UniqueName: \"kubernetes.io/projected/14da4940-c4b9-41e9-bd05-d84ef5d63c03-kube-api-access-fzf4z\") pod \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.300074 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-config-data\") pod \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.300191 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-combined-ca-bundle\") pod \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.300224 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14da4940-c4b9-41e9-bd05-d84ef5d63c03-logs\") pod \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\" (UID: \"14da4940-c4b9-41e9-bd05-d84ef5d63c03\") " Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.300938 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14da4940-c4b9-41e9-bd05-d84ef5d63c03-logs" (OuterVolumeSpecName: "logs") pod "14da4940-c4b9-41e9-bd05-d84ef5d63c03" (UID: "14da4940-c4b9-41e9-bd05-d84ef5d63c03"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.314310 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14da4940-c4b9-41e9-bd05-d84ef5d63c03-kube-api-access-fzf4z" (OuterVolumeSpecName: "kube-api-access-fzf4z") pod "14da4940-c4b9-41e9-bd05-d84ef5d63c03" (UID: "14da4940-c4b9-41e9-bd05-d84ef5d63c03"). InnerVolumeSpecName "kube-api-access-fzf4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.335079 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14da4940-c4b9-41e9-bd05-d84ef5d63c03" (UID: "14da4940-c4b9-41e9-bd05-d84ef5d63c03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.335181 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-config-data" (OuterVolumeSpecName: "config-data") pod "14da4940-c4b9-41e9-bd05-d84ef5d63c03" (UID: "14da4940-c4b9-41e9-bd05-d84ef5d63c03"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.403375 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.403425 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14da4940-c4b9-41e9-bd05-d84ef5d63c03-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.403480 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzf4z\" (UniqueName: \"kubernetes.io/projected/14da4940-c4b9-41e9-bd05-d84ef5d63c03-kube-api-access-fzf4z\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.403497 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14da4940-c4b9-41e9-bd05-d84ef5d63c03-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.596783 4731 generic.go:334] "Generic (PLEG): container finished" podID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerID="c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1" exitCode=0 Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.597215 4731 generic.go:334] "Generic (PLEG): container finished" podID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerID="dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be" exitCode=143 Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.596993 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.597031 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14da4940-c4b9-41e9-bd05-d84ef5d63c03","Type":"ContainerDied","Data":"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1"} Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.599708 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14da4940-c4b9-41e9-bd05-d84ef5d63c03","Type":"ContainerDied","Data":"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be"} Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.599735 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14da4940-c4b9-41e9-bd05-d84ef5d63c03","Type":"ContainerDied","Data":"d6cb23a8bbddf38275c3e1f0d9efdad509d6e32171c5bd08080e85d1ea48838f"} Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.599754 4731 scope.go:117] "RemoveContainer" containerID="c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.635234 4731 scope.go:117] "RemoveContainer" containerID="dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.647807 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.658826 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.677563 4731 scope.go:117] "RemoveContainer" containerID="c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1" Dec 03 19:14:51 crc kubenswrapper[4731]: E1203 19:14:51.679931 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1\": container with ID starting with c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1 not found: ID does not exist" containerID="c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.679976 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1"} err="failed to get container status \"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1\": rpc error: code = NotFound desc = could not find container \"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1\": container with ID starting with c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1 not found: ID does not exist" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.680008 4731 scope.go:117] "RemoveContainer" containerID="dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be" Dec 03 19:14:51 crc kubenswrapper[4731]: E1203 19:14:51.687580 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be\": container with ID starting with dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be not found: ID does not exist" containerID="dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.687651 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be"} err="failed to get container status \"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be\": rpc error: code = NotFound desc = could not find container \"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be\": container with ID 
starting with dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be not found: ID does not exist" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.687690 4731 scope.go:117] "RemoveContainer" containerID="c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.693412 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1"} err="failed to get container status \"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1\": rpc error: code = NotFound desc = could not find container \"c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1\": container with ID starting with c7c5b0c39ed4baa05788fd4bec3b4c2edb566411e72a7d565009f7a1f7e9fcf1 not found: ID does not exist" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.693476 4731 scope.go:117] "RemoveContainer" containerID="dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.694639 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be"} err="failed to get container status \"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be\": rpc error: code = NotFound desc = could not find container \"dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be\": container with ID starting with dbcf17d6c389dc2987edb487dc3490ac00209664da2c9596f8e4de69ce2fe8be not found: ID does not exist" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.696940 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:51 crc kubenswrapper[4731]: E1203 19:14:51.697565 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" 
containerName="nova-metadata-log" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.697592 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerName="nova-metadata-log" Dec 03 19:14:51 crc kubenswrapper[4731]: E1203 19:14:51.697608 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerName="nova-metadata-metadata" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.697617 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerName="nova-metadata-metadata" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.698061 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerName="nova-metadata-metadata" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.698106 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" containerName="nova-metadata-log" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.699950 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.704621 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.704936 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.711434 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.711858 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/843462d1-0c61-42c1-93c7-63c4857a1f84-logs\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.711942 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.711972 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcj7j\" (UniqueName: \"kubernetes.io/projected/843462d1-0c61-42c1-93c7-63c4857a1f84-kube-api-access-hcj7j\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.712659 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.712755 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-config-data\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.815151 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.815204 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcj7j\" (UniqueName: \"kubernetes.io/projected/843462d1-0c61-42c1-93c7-63c4857a1f84-kube-api-access-hcj7j\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.815291 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.815348 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-config-data\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 
19:14:51.815395 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/843462d1-0c61-42c1-93c7-63c4857a1f84-logs\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.815934 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/843462d1-0c61-42c1-93c7-63c4857a1f84-logs\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.822053 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.824794 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-config-data\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.826317 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.837377 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcj7j\" (UniqueName: \"kubernetes.io/projected/843462d1-0c61-42c1-93c7-63c4857a1f84-kube-api-access-hcj7j\") pod \"nova-metadata-0\" (UID: 
\"843462d1-0c61-42c1-93c7-63c4857a1f84\") " pod="openstack/nova-metadata-0" Dec 03 19:14:51 crc kubenswrapper[4731]: I1203 19:14:51.870793 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14da4940-c4b9-41e9-bd05-d84ef5d63c03" path="/var/lib/kubelet/pods/14da4940-c4b9-41e9-bd05-d84ef5d63c03/volumes" Dec 03 19:14:52 crc kubenswrapper[4731]: I1203 19:14:52.026983 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:52 crc kubenswrapper[4731]: I1203 19:14:52.621851 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:53 crc kubenswrapper[4731]: I1203 19:14:53.628135 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"843462d1-0c61-42c1-93c7-63c4857a1f84","Type":"ContainerStarted","Data":"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c"} Dec 03 19:14:53 crc kubenswrapper[4731]: I1203 19:14:53.628908 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"843462d1-0c61-42c1-93c7-63c4857a1f84","Type":"ContainerStarted","Data":"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697"} Dec 03 19:14:53 crc kubenswrapper[4731]: I1203 19:14:53.628924 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"843462d1-0c61-42c1-93c7-63c4857a1f84","Type":"ContainerStarted","Data":"81b1ce3481029135477a2941279a12941d097158fb51f3b2ca398aff27289cc6"} Dec 03 19:14:53 crc kubenswrapper[4731]: I1203 19:14:53.652046 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.652026856 podStartE2EDuration="2.652026856s" podCreationTimestamp="2025-12-03 19:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:14:53.64834854 
+0000 UTC m=+1214.246943004" watchObservedRunningTime="2025-12-03 19:14:53.652026856 +0000 UTC m=+1214.250621320" Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.290942 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.291329 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.331567 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.403625 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.403749 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.408785 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.656121 4731 generic.go:334] "Generic (PLEG): container finished" podID="676bb34b-7c07-46fb-bf1b-62e21e5293f8" containerID="67998100f37b7a848af7004d44074badb97c0e3bd8c716933a0f6321765c73ef" exitCode=0 Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.656489 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" event={"ID":"676bb34b-7c07-46fb-bf1b-62e21e5293f8","Type":"ContainerDied","Data":"67998100f37b7a848af7004d44074badb97c0e3bd8c716933a0f6321765c73ef"} Dec 03 19:14:54 crc kubenswrapper[4731]: I1203 19:14:54.705486 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 19:14:55 crc kubenswrapper[4731]: I1203 19:14:55.485478 4731 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 19:14:55 crc kubenswrapper[4731]: I1203 19:14:55.485522 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 19:14:55 crc kubenswrapper[4731]: I1203 19:14:55.672007 4731 generic.go:334] "Generic (PLEG): container finished" podID="7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" containerID="76137f6787685f877f7650d2c7f7cc5b85e27124453be78f4ae6258123f56ba9" exitCode=0 Dec 03 19:14:55 crc kubenswrapper[4731]: I1203 19:14:55.672385 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5fcx2" event={"ID":"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0","Type":"ContainerDied","Data":"76137f6787685f877f7650d2c7f7cc5b85e27124453be78f4ae6258123f56ba9"} Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.078301 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.137840 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-combined-ca-bundle\") pod \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.138078 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-config-data\") pod \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.138244 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-scripts\") pod \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.138702 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgh9p\" (UniqueName: \"kubernetes.io/projected/676bb34b-7c07-46fb-bf1b-62e21e5293f8-kube-api-access-sgh9p\") pod \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\" (UID: \"676bb34b-7c07-46fb-bf1b-62e21e5293f8\") " Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.147011 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/676bb34b-7c07-46fb-bf1b-62e21e5293f8-kube-api-access-sgh9p" (OuterVolumeSpecName: "kube-api-access-sgh9p") pod "676bb34b-7c07-46fb-bf1b-62e21e5293f8" (UID: "676bb34b-7c07-46fb-bf1b-62e21e5293f8"). InnerVolumeSpecName "kube-api-access-sgh9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.148323 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-scripts" (OuterVolumeSpecName: "scripts") pod "676bb34b-7c07-46fb-bf1b-62e21e5293f8" (UID: "676bb34b-7c07-46fb-bf1b-62e21e5293f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.174774 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-config-data" (OuterVolumeSpecName: "config-data") pod "676bb34b-7c07-46fb-bf1b-62e21e5293f8" (UID: "676bb34b-7c07-46fb-bf1b-62e21e5293f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.191703 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "676bb34b-7c07-46fb-bf1b-62e21e5293f8" (UID: "676bb34b-7c07-46fb-bf1b-62e21e5293f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.242540 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.242592 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.242602 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676bb34b-7c07-46fb-bf1b-62e21e5293f8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.242616 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgh9p\" (UniqueName: \"kubernetes.io/projected/676bb34b-7c07-46fb-bf1b-62e21e5293f8-kube-api-access-sgh9p\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.691341 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" event={"ID":"676bb34b-7c07-46fb-bf1b-62e21e5293f8","Type":"ContainerDied","Data":"3702084fc92dd13af4255a2ac5d56816ab2c77c2d6ad2852308721c769956b70"} Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.694309 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3702084fc92dd13af4255a2ac5d56816ab2c77c2d6ad2852308721c769956b70" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.691697 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rbmpj" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.784856 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 19:14:56 crc kubenswrapper[4731]: E1203 19:14:56.798471 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="676bb34b-7c07-46fb-bf1b-62e21e5293f8" containerName="nova-cell1-conductor-db-sync" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.799093 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="676bb34b-7c07-46fb-bf1b-62e21e5293f8" containerName="nova-cell1-conductor-db-sync" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.851629 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="676bb34b-7c07-46fb-bf1b-62e21e5293f8" containerName="nova-cell1-conductor-db-sync" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.855150 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.855324 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.859857 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.961425 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjsr\" (UniqueName: \"kubernetes.io/projected/36c34f8f-b818-4241-a974-316d98a4eaca-kube-api-access-vnjsr\") pod \"nova-cell1-conductor-0\" (UID: \"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.962102 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36c34f8f-b818-4241-a974-316d98a4eaca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:56 crc kubenswrapper[4731]: I1203 19:14:56.962292 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c34f8f-b818-4241-a974-316d98a4eaca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.027999 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.028074 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.065052 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjsr\" (UniqueName: 
\"kubernetes.io/projected/36c34f8f-b818-4241-a974-316d98a4eaca-kube-api-access-vnjsr\") pod \"nova-cell1-conductor-0\" (UID: \"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.065472 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36c34f8f-b818-4241-a974-316d98a4eaca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.065646 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c34f8f-b818-4241-a974-316d98a4eaca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.073881 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36c34f8f-b818-4241-a974-316d98a4eaca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.073940 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c34f8f-b818-4241-a974-316d98a4eaca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.083388 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjsr\" (UniqueName: \"kubernetes.io/projected/36c34f8f-b818-4241-a974-316d98a4eaca-kube-api-access-vnjsr\") pod \"nova-cell1-conductor-0\" (UID: 
\"36c34f8f-b818-4241-a974-316d98a4eaca\") " pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.213443 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.367544 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.475347 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-scripts\") pod \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.475427 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgx86\" (UniqueName: \"kubernetes.io/projected/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-kube-api-access-tgx86\") pod \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.475597 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-combined-ca-bundle\") pod \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.475646 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-config-data\") pod \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\" (UID: \"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0\") " Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.481992 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-scripts" (OuterVolumeSpecName: "scripts") pod "7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" (UID: "7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.482774 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-kube-api-access-tgx86" (OuterVolumeSpecName: "kube-api-access-tgx86") pod "7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" (UID: "7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0"). InnerVolumeSpecName "kube-api-access-tgx86". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.528734 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-config-data" (OuterVolumeSpecName: "config-data") pod "7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" (UID: "7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.535195 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" (UID: "7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.578703 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.578760 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.578773 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.578792 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgx86\" (UniqueName: \"kubernetes.io/projected/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0-kube-api-access-tgx86\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.720004 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5fcx2" event={"ID":"7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0","Type":"ContainerDied","Data":"8ed36f91c09b4ba37b5aa6189aba94fe2aa646912189a29d1e6c056a1886c956"} Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.720055 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ed36f91c09b4ba37b5aa6189aba94fe2aa646912189a29d1e6c056a1886c956" Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.720128 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5fcx2" Dec 03 19:14:57 crc kubenswrapper[4731]: W1203 19:14:57.777201 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36c34f8f_b818_4241_a974_316d98a4eaca.slice/crio-8f1ad4132871dc5332039c266876d84bf6ff7aa6227cb5e763b3cc35bcfd8015 WatchSource:0}: Error finding container 8f1ad4132871dc5332039c266876d84bf6ff7aa6227cb5e763b3cc35bcfd8015: Status 404 returned error can't find the container with id 8f1ad4132871dc5332039c266876d84bf6ff7aa6227cb5e763b3cc35bcfd8015 Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.779013 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.910764 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.911046 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-log" containerID="cri-o://5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe" gracePeriod=30 Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.911189 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-api" containerID="cri-o://d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba" gracePeriod=30 Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.928090 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:14:57 crc kubenswrapper[4731]: I1203 19:14:57.928412 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dde31fb3-591b-4f8c-af33-df37bb71c3a6" 
containerName="nova-scheduler-scheduler" containerID="cri-o://2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812" gracePeriod=30 Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.044400 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.045020 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerName="nova-metadata-log" containerID="cri-o://2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697" gracePeriod=30 Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.045546 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerName="nova-metadata-metadata" containerID="cri-o://943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c" gracePeriod=30 Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.664159 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.701874 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-config-data\") pod \"843462d1-0c61-42c1-93c7-63c4857a1f84\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.701943 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcj7j\" (UniqueName: \"kubernetes.io/projected/843462d1-0c61-42c1-93c7-63c4857a1f84-kube-api-access-hcj7j\") pod \"843462d1-0c61-42c1-93c7-63c4857a1f84\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.702001 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-nova-metadata-tls-certs\") pod \"843462d1-0c61-42c1-93c7-63c4857a1f84\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.702140 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/843462d1-0c61-42c1-93c7-63c4857a1f84-logs\") pod \"843462d1-0c61-42c1-93c7-63c4857a1f84\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.702167 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-combined-ca-bundle\") pod \"843462d1-0c61-42c1-93c7-63c4857a1f84\" (UID: \"843462d1-0c61-42c1-93c7-63c4857a1f84\") " Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.704247 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/843462d1-0c61-42c1-93c7-63c4857a1f84-logs" (OuterVolumeSpecName: "logs") pod "843462d1-0c61-42c1-93c7-63c4857a1f84" (UID: "843462d1-0c61-42c1-93c7-63c4857a1f84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.709908 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843462d1-0c61-42c1-93c7-63c4857a1f84-kube-api-access-hcj7j" (OuterVolumeSpecName: "kube-api-access-hcj7j") pod "843462d1-0c61-42c1-93c7-63c4857a1f84" (UID: "843462d1-0c61-42c1-93c7-63c4857a1f84"). InnerVolumeSpecName "kube-api-access-hcj7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.733533 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"36c34f8f-b818-4241-a974-316d98a4eaca","Type":"ContainerStarted","Data":"5f4948f9dea81aa0dae30f565978cbbff6413104034e133f76f2f4f053d367d9"} Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.733594 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"36c34f8f-b818-4241-a974-316d98a4eaca","Type":"ContainerStarted","Data":"8f1ad4132871dc5332039c266876d84bf6ff7aa6227cb5e763b3cc35bcfd8015"} Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.734306 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.738522 4731 generic.go:334] "Generic (PLEG): container finished" podID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerID="943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c" exitCode=0 Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.738557 4731 generic.go:334] "Generic (PLEG): container finished" podID="843462d1-0c61-42c1-93c7-63c4857a1f84" 
containerID="2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697" exitCode=143 Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.738623 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"843462d1-0c61-42c1-93c7-63c4857a1f84","Type":"ContainerDied","Data":"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c"} Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.738660 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"843462d1-0c61-42c1-93c7-63c4857a1f84","Type":"ContainerDied","Data":"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697"} Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.738671 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"843462d1-0c61-42c1-93c7-63c4857a1f84","Type":"ContainerDied","Data":"81b1ce3481029135477a2941279a12941d097158fb51f3b2ca398aff27289cc6"} Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.738690 4731 scope.go:117] "RemoveContainer" containerID="943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.738855 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.745866 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "843462d1-0c61-42c1-93c7-63c4857a1f84" (UID: "843462d1-0c61-42c1-93c7-63c4857a1f84"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.753187 4731 generic.go:334] "Generic (PLEG): container finished" podID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerID="5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe" exitCode=143 Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.753264 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7","Type":"ContainerDied","Data":"5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe"} Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.763063 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-config-data" (OuterVolumeSpecName: "config-data") pod "843462d1-0c61-42c1-93c7-63c4857a1f84" (UID: "843462d1-0c61-42c1-93c7-63c4857a1f84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.768418 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "843462d1-0c61-42c1-93c7-63c4857a1f84" (UID: "843462d1-0c61-42c1-93c7-63c4857a1f84"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.768602 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.768582049 podStartE2EDuration="2.768582049s" podCreationTimestamp="2025-12-03 19:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:14:58.761450616 +0000 UTC m=+1219.360045080" watchObservedRunningTime="2025-12-03 19:14:58.768582049 +0000 UTC m=+1219.367176513" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.788397 4731 scope.go:117] "RemoveContainer" containerID="2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.807234 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.807307 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcj7j\" (UniqueName: \"kubernetes.io/projected/843462d1-0c61-42c1-93c7-63c4857a1f84-kube-api-access-hcj7j\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.807324 4731 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.807334 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/843462d1-0c61-42c1-93c7-63c4857a1f84-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.807345 4731 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843462d1-0c61-42c1-93c7-63c4857a1f84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.810331 4731 scope.go:117] "RemoveContainer" containerID="943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c" Dec 03 19:14:58 crc kubenswrapper[4731]: E1203 19:14:58.810797 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c\": container with ID starting with 943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c not found: ID does not exist" containerID="943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.810832 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c"} err="failed to get container status \"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c\": rpc error: code = NotFound desc = could not find container \"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c\": container with ID starting with 943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c not found: ID does not exist" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.810886 4731 scope.go:117] "RemoveContainer" containerID="2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697" Dec 03 19:14:58 crc kubenswrapper[4731]: E1203 19:14:58.811139 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697\": container with ID starting with 2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697 not found: ID does not exist" 
containerID="2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.811164 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697"} err="failed to get container status \"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697\": rpc error: code = NotFound desc = could not find container \"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697\": container with ID starting with 2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697 not found: ID does not exist" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.811180 4731 scope.go:117] "RemoveContainer" containerID="943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.811410 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c"} err="failed to get container status \"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c\": rpc error: code = NotFound desc = could not find container \"943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c\": container with ID starting with 943ee4b5882056222e9ada1e00171a9868a1a0a4808b2026d884f49c9006d35c not found: ID does not exist" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.811454 4731 scope.go:117] "RemoveContainer" containerID="2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697" Dec 03 19:14:58 crc kubenswrapper[4731]: I1203 19:14:58.811671 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697"} err="failed to get container status \"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697\": rpc error: code = NotFound desc = could 
not find container \"2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697\": container with ID starting with 2c61f036dec0c521fbcc6a49b573c0535d0c386484828da0cbab687b74bdb697 not found: ID does not exist" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.086579 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.107475 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.115504 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:59 crc kubenswrapper[4731]: E1203 19:14:59.116123 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerName="nova-metadata-log" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.116142 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerName="nova-metadata-log" Dec 03 19:14:59 crc kubenswrapper[4731]: E1203 19:14:59.116158 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" containerName="nova-manage" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.116167 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" containerName="nova-manage" Dec 03 19:14:59 crc kubenswrapper[4731]: E1203 19:14:59.116179 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerName="nova-metadata-metadata" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.116185 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerName="nova-metadata-metadata" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.116494 4731 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerName="nova-metadata-log" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.116514 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" containerName="nova-manage" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.116533 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" containerName="nova-metadata-metadata" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.118003 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.122368 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.122510 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.132638 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.218956 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a9b5335-b617-430e-bd5b-0dbbac80d148-logs\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.219926 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ckdl\" (UniqueName: \"kubernetes.io/projected/5a9b5335-b617-430e-bd5b-0dbbac80d148-kube-api-access-7ckdl\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.220030 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.220103 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-config-data\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.220133 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: E1203 19:14:59.293667 4731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 19:14:59 crc kubenswrapper[4731]: E1203 19:14:59.302704 4731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 19:14:59 crc kubenswrapper[4731]: E1203 19:14:59.304655 4731 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 19:14:59 crc kubenswrapper[4731]: E1203 19:14:59.304759 4731 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dde31fb3-591b-4f8c-af33-df37bb71c3a6" containerName="nova-scheduler-scheduler" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.322292 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.322426 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-config-data\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.322461 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.322534 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a9b5335-b617-430e-bd5b-0dbbac80d148-logs\") pod 
\"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.322559 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ckdl\" (UniqueName: \"kubernetes.io/projected/5a9b5335-b617-430e-bd5b-0dbbac80d148-kube-api-access-7ckdl\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.323208 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a9b5335-b617-430e-bd5b-0dbbac80d148-logs\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.327859 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.327859 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.330715 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-config-data\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.364588 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7ckdl\" (UniqueName: \"kubernetes.io/projected/5a9b5335-b617-430e-bd5b-0dbbac80d148-kube-api-access-7ckdl\") pod \"nova-metadata-0\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.520635 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:14:59 crc kubenswrapper[4731]: I1203 19:14:59.868333 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843462d1-0c61-42c1-93c7-63c4857a1f84" path="/var/lib/kubelet/pods/843462d1-0c61-42c1-93c7-63c4857a1f84/volumes" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.019519 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.144338 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf"] Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.146141 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.150959 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.151077 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.158586 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf"] Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.281691 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea93acb8-78a3-40e3-9555-e98b23ec06c4-secret-volume\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.281759 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea93acb8-78a3-40e3-9555-e98b23ec06c4-config-volume\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.282432 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsbbt\" (UniqueName: \"kubernetes.io/projected/ea93acb8-78a3-40e3-9555-e98b23ec06c4-kube-api-access-vsbbt\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.383944 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsbbt\" (UniqueName: \"kubernetes.io/projected/ea93acb8-78a3-40e3-9555-e98b23ec06c4-kube-api-access-vsbbt\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.384475 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea93acb8-78a3-40e3-9555-e98b23ec06c4-secret-volume\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.384509 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea93acb8-78a3-40e3-9555-e98b23ec06c4-config-volume\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.385579 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea93acb8-78a3-40e3-9555-e98b23ec06c4-config-volume\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.390429 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ea93acb8-78a3-40e3-9555-e98b23ec06c4-secret-volume\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.409228 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsbbt\" (UniqueName: \"kubernetes.io/projected/ea93acb8-78a3-40e3-9555-e98b23ec06c4-kube-api-access-vsbbt\") pod \"collect-profiles-29413155-4mnkf\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.493119 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.792526 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5a9b5335-b617-430e-bd5b-0dbbac80d148","Type":"ContainerStarted","Data":"1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155"} Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.792578 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5a9b5335-b617-430e-bd5b-0dbbac80d148","Type":"ContainerStarted","Data":"44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69"} Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.792589 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5a9b5335-b617-430e-bd5b-0dbbac80d148","Type":"ContainerStarted","Data":"b8345ded1c8b06ecab40f1977d0a65de307e1846df6c096371bf3bc1a5856564"} Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.830288 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.830262541 
podStartE2EDuration="1.830262541s" podCreationTimestamp="2025-12-03 19:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:00.815923352 +0000 UTC m=+1221.414517816" watchObservedRunningTime="2025-12-03 19:15:00.830262541 +0000 UTC m=+1221.428857015" Dec 03 19:15:00 crc kubenswrapper[4731]: I1203 19:15:00.983242 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf"] Dec 03 19:15:01 crc kubenswrapper[4731]: E1203 19:15:01.418543 4731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcd1e6d5_7cd4_4cb5_94a1_5a98c8ecabf7.slice/crio-conmon-d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba.scope\": RecentStats: unable to find data in memory cache]" Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.728483 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.763183 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.827619 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-config-data\") pod \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.827680 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-combined-ca-bundle\") pod \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.827737 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-combined-ca-bundle\") pod \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.827772 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-logs\") pod \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.827841 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-config-data\") pod \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.827872 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8h9z\" (UniqueName: 
\"kubernetes.io/projected/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-kube-api-access-p8h9z\") pod \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\" (UID: \"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7\") " Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.827907 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4t8l\" (UniqueName: \"kubernetes.io/projected/dde31fb3-591b-4f8c-af33-df37bb71c3a6-kube-api-access-z4t8l\") pod \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\" (UID: \"dde31fb3-591b-4f8c-af33-df37bb71c3a6\") " Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.842782 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-logs" (OuterVolumeSpecName: "logs") pod "fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" (UID: "fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.845780 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde31fb3-591b-4f8c-af33-df37bb71c3a6-kube-api-access-z4t8l" (OuterVolumeSpecName: "kube-api-access-z4t8l") pod "dde31fb3-591b-4f8c-af33-df37bb71c3a6" (UID: "dde31fb3-591b-4f8c-af33-df37bb71c3a6"). InnerVolumeSpecName "kube-api-access-z4t8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.852748 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-kube-api-access-p8h9z" (OuterVolumeSpecName: "kube-api-access-p8h9z") pod "fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" (UID: "fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7"). InnerVolumeSpecName "kube-api-access-p8h9z". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.882633 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" (UID: "fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.883183 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-config-data" (OuterVolumeSpecName: "config-data") pod "fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" (UID: "fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.883296 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.883503 4731 generic.go:334] "Generic (PLEG): container finished" podID="dde31fb3-591b-4f8c-af33-df37bb71c3a6" containerID="2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812" exitCode=0
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.893172 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dde31fb3-591b-4f8c-af33-df37bb71c3a6" (UID: "dde31fb3-591b-4f8c-af33-df37bb71c3a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.908373 4731 generic.go:334] "Generic (PLEG): container finished" podID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerID="d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba" exitCode=0
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.908528 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.926626 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dde31fb3-591b-4f8c-af33-df37bb71c3a6","Type":"ContainerDied","Data":"2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812"}
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.926686 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dde31fb3-591b-4f8c-af33-df37bb71c3a6","Type":"ContainerDied","Data":"623063e970510c484070e045b878d0728871ee2b436c4b2cf8229891caea7581"}
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.926699 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7","Type":"ContainerDied","Data":"d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba"}
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.926715 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7","Type":"ContainerDied","Data":"2193ad7b91090e49f296851a0a61a9e88dd05f3cbfab658add4fa357efb518ae"}
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.926725 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" event={"ID":"ea93acb8-78a3-40e3-9555-e98b23ec06c4","Type":"ContainerStarted","Data":"a56fc00a090ea81dd22d1e188010b38e4e40dd29152d94172ab2b190e9179fe7"}
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.926738 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" event={"ID":"ea93acb8-78a3-40e3-9555-e98b23ec06c4","Type":"ContainerStarted","Data":"c93dbdc2a65393ac42bd2fabe8fccbcdd7ea0198129d72975b4b58ada098428d"}
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.926772 4731 scope.go:117] "RemoveContainer" containerID="2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812"
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.934244 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.936491 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.936557 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.936568 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-logs\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.936578 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8h9z\" (UniqueName: \"kubernetes.io/projected/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7-kube-api-access-p8h9z\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.936589 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4t8l\" (UniqueName: \"kubernetes.io/projected/dde31fb3-591b-4f8c-af33-df37bb71c3a6-kube-api-access-z4t8l\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.943372 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" podStartSLOduration=1.943349789 podStartE2EDuration="1.943349789s" podCreationTimestamp="2025-12-03 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:01.936962178 +0000 UTC m=+1222.535556642" watchObservedRunningTime="2025-12-03 19:15:01.943349789 +0000 UTC m=+1222.541944253"
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.971893 4731 scope.go:117] "RemoveContainer" containerID="2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812"
Dec 03 19:15:01 crc kubenswrapper[4731]: E1203 19:15:01.972549 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812\": container with ID starting with 2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812 not found: ID does not exist" containerID="2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812"
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.972612 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812"} err="failed to get container status \"2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812\": rpc error: code = NotFound desc = could not find container \"2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812\": container with ID starting with 2d508b30d8627aa5e16a9ce5869c45db7b9369d0bd272d1cde5439d32abe4812 not found: ID does not exist"
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.972665 4731 scope.go:117] "RemoveContainer" containerID="d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba"
Dec 03 19:15:01 crc kubenswrapper[4731]: I1203 19:15:01.984999 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.002520 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-config-data" (OuterVolumeSpecName: "config-data") pod "dde31fb3-591b-4f8c-af33-df37bb71c3a6" (UID: "dde31fb3-591b-4f8c-af33-df37bb71c3a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.014245 4731 scope.go:117] "RemoveContainer" containerID="5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.016336 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.034993 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: E1203 19:15:02.035622 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-api"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.035643 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-api"
Dec 03 19:15:02 crc kubenswrapper[4731]: E1203 19:15:02.035660 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-log"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.035667 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-log"
Dec 03 19:15:02 crc kubenswrapper[4731]: E1203 19:15:02.035679 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde31fb3-591b-4f8c-af33-df37bb71c3a6" containerName="nova-scheduler-scheduler"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.035686 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde31fb3-591b-4f8c-af33-df37bb71c3a6" containerName="nova-scheduler-scheduler"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.035907 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde31fb3-591b-4f8c-af33-df37bb71c3a6" containerName="nova-scheduler-scheduler"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.035927 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-api"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.035942 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" containerName="nova-api-log"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.037162 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.040234 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde31fb3-591b-4f8c-af33-df37bb71c3a6-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.044529 4731 scope.go:117] "RemoveContainer" containerID="d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.044811 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.045092 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: E1203 19:15:02.046716 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba\": container with ID starting with d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba not found: ID does not exist" containerID="d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.046760 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba"} err="failed to get container status \"d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba\": rpc error: code = NotFound desc = could not find container \"d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba\": container with ID starting with d4294271d90f46992d18d5cc97f9f8fac16dd213691a1a06d60707d88f7919ba not found: ID does not exist"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.046794 4731 scope.go:117] "RemoveContainer" containerID="5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe"
Dec 03 19:15:02 crc kubenswrapper[4731]: E1203 19:15:02.048174 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe\": container with ID starting with 5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe not found: ID does not exist" containerID="5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.048346 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe"} err="failed to get container status \"5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe\": rpc error: code = NotFound desc = could not find container \"5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe\": container with ID starting with 5d0d8009bcdea215529918c40f751755b14cba4012ac68869ea209c49d91e7fe not found: ID does not exist"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.142497 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.142636 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtsst\" (UniqueName: \"kubernetes.io/projected/3319b516-34ef-4ddf-8939-da2d140e015d-kube-api-access-vtsst\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.142666 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3319b516-34ef-4ddf-8939-da2d140e015d-logs\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.142700 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-config-data\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.221227 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.237079 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.246383 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtsst\" (UniqueName: \"kubernetes.io/projected/3319b516-34ef-4ddf-8939-da2d140e015d-kube-api-access-vtsst\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.246457 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3319b516-34ef-4ddf-8939-da2d140e015d-logs\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.246506 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-config-data\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.246580 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.247095 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3319b516-34ef-4ddf-8939-da2d140e015d-logs\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.254370 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-config-data\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.257025 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.258651 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.260874 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.261339 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.264503 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtsst\" (UniqueName: \"kubernetes.io/projected/3319b516-34ef-4ddf-8939-da2d140e015d-kube-api-access-vtsst\") pod \"nova-api-0\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.287143 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.298341 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.348137 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-config-data\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.348295 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.348347 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxncs\" (UniqueName: \"kubernetes.io/projected/32385ead-eb98-476c-89e5-5d61b693d178-kube-api-access-mxncs\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.369128 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.450549 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-config-data\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.451114 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.451170 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxncs\" (UniqueName: \"kubernetes.io/projected/32385ead-eb98-476c-89e5-5d61b693d178-kube-api-access-mxncs\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.457803 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.464144 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-config-data\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.474184 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxncs\" (UniqueName: \"kubernetes.io/projected/32385ead-eb98-476c-89e5-5d61b693d178-kube-api-access-mxncs\") pod \"nova-scheduler-0\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.683780 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.876463 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.927930 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3319b516-34ef-4ddf-8939-da2d140e015d","Type":"ContainerStarted","Data":"34969ce44eaf702f08bc265dd826e74ac8d8a70ee0e83c4e217486a353c138b1"}
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.930544 4731 generic.go:334] "Generic (PLEG): container finished" podID="ea93acb8-78a3-40e3-9555-e98b23ec06c4" containerID="a56fc00a090ea81dd22d1e188010b38e4e40dd29152d94172ab2b190e9179fe7" exitCode=0
Dec 03 19:15:02 crc kubenswrapper[4731]: I1203 19:15:02.930615 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" event={"ID":"ea93acb8-78a3-40e3-9555-e98b23ec06c4","Type":"ContainerDied","Data":"a56fc00a090ea81dd22d1e188010b38e4e40dd29152d94172ab2b190e9179fe7"}
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.163913 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.872064 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde31fb3-591b-4f8c-af33-df37bb71c3a6" path="/var/lib/kubelet/pods/dde31fb3-591b-4f8c-af33-df37bb71c3a6/volumes"
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.872684 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7" path="/var/lib/kubelet/pods/fcd1e6d5-7cd4-4cb5-94a1-5a98c8ecabf7/volumes"
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.947251 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32385ead-eb98-476c-89e5-5d61b693d178","Type":"ContainerStarted","Data":"5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9"}
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.947341 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32385ead-eb98-476c-89e5-5d61b693d178","Type":"ContainerStarted","Data":"c99671e8eb896dd6afd462368158a8d2073de1b17055ca1d3308190f9df38e26"}
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.951610 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3319b516-34ef-4ddf-8939-da2d140e015d","Type":"ContainerStarted","Data":"39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34"}
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.951892 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3319b516-34ef-4ddf-8939-da2d140e015d","Type":"ContainerStarted","Data":"6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d"}
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.968714 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.968682862 podStartE2EDuration="1.968682862s" podCreationTimestamp="2025-12-03 19:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:03.966981528 +0000 UTC m=+1224.565576012" watchObservedRunningTime="2025-12-03 19:15:03.968682862 +0000 UTC m=+1224.567277336"
Dec 03 19:15:03 crc kubenswrapper[4731]: I1203 19:15:03.989889 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.989858055 podStartE2EDuration="2.989858055s" podCreationTimestamp="2025-12-03 19:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:03.984858238 +0000 UTC m=+1224.583452712" watchObservedRunningTime="2025-12-03 19:15:03.989858055 +0000 UTC m=+1224.588452519"
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.372903 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf"
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.502958 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea93acb8-78a3-40e3-9555-e98b23ec06c4-config-volume\") pod \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") "
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.503052 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsbbt\" (UniqueName: \"kubernetes.io/projected/ea93acb8-78a3-40e3-9555-e98b23ec06c4-kube-api-access-vsbbt\") pod \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") "
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.503188 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea93acb8-78a3-40e3-9555-e98b23ec06c4-secret-volume\") pod \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\" (UID: \"ea93acb8-78a3-40e3-9555-e98b23ec06c4\") "
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.504024 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea93acb8-78a3-40e3-9555-e98b23ec06c4-config-volume" (OuterVolumeSpecName: "config-volume") pod "ea93acb8-78a3-40e3-9555-e98b23ec06c4" (UID: "ea93acb8-78a3-40e3-9555-e98b23ec06c4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.508714 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea93acb8-78a3-40e3-9555-e98b23ec06c4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ea93acb8-78a3-40e3-9555-e98b23ec06c4" (UID: "ea93acb8-78a3-40e3-9555-e98b23ec06c4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.516970 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea93acb8-78a3-40e3-9555-e98b23ec06c4-kube-api-access-vsbbt" (OuterVolumeSpecName: "kube-api-access-vsbbt") pod "ea93acb8-78a3-40e3-9555-e98b23ec06c4" (UID: "ea93acb8-78a3-40e3-9555-e98b23ec06c4"). InnerVolumeSpecName "kube-api-access-vsbbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.521036 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.521148 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.605567 4731 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea93acb8-78a3-40e3-9555-e98b23ec06c4-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.605605 4731 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea93acb8-78a3-40e3-9555-e98b23ec06c4-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.605616 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsbbt\" (UniqueName: \"kubernetes.io/projected/ea93acb8-78a3-40e3-9555-e98b23ec06c4-kube-api-access-vsbbt\") on node \"crc\" DevicePath \"\""
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.710541 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.977341 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf"
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.985605 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413155-4mnkf" event={"ID":"ea93acb8-78a3-40e3-9555-e98b23ec06c4","Type":"ContainerDied","Data":"c93dbdc2a65393ac42bd2fabe8fccbcdd7ea0198129d72975b4b58ada098428d"}
Dec 03 19:15:04 crc kubenswrapper[4731]: I1203 19:15:04.985681 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c93dbdc2a65393ac42bd2fabe8fccbcdd7ea0198129d72975b4b58ada098428d"
Dec 03 19:15:07 crc kubenswrapper[4731]: I1203 19:15:07.684777 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 03 19:15:09 crc kubenswrapper[4731]: I1203 19:15:09.521546 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 03 19:15:09 crc kubenswrapper[4731]: I1203 19:15:09.521939 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 03 19:15:10 crc kubenswrapper[4731]: I1203 19:15:10.539469 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.182:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 19:15:10 crc kubenswrapper[4731]: I1203 19:15:10.539469 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.182:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 19:15:12 crc kubenswrapper[4731]: I1203 19:15:12.370079 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 03 19:15:12 crc kubenswrapper[4731]: I1203 19:15:12.370541 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 03 19:15:12 crc kubenswrapper[4731]: I1203 19:15:12.685638 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 03 19:15:12 crc kubenswrapper[4731]: I1203 19:15:12.721815 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 03 19:15:13 crc kubenswrapper[4731]: I1203 19:15:13.079871 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 03 19:15:13 crc kubenswrapper[4731]: I1203 19:15:13.412663 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 19:15:13 crc kubenswrapper[4731]: I1203 19:15:13.412722 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 19:15:19 crc kubenswrapper[4731]: I1203 19:15:19.540567 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 03 19:15:19 crc kubenswrapper[4731]: I1203 19:15:19.541765 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 03 19:15:19 crc kubenswrapper[4731]: I1203 19:15:19.549994 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 03 19:15:19 crc kubenswrapper[4731]: I1203 19:15:19.550295 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.060675 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.129238 4731 generic.go:334] "Generic (PLEG): container finished" podID="9bef1af4-84af-4242-b22e-5673b6fc209a" containerID="9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8" exitCode=137
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.129364 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.129401 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9bef1af4-84af-4242-b22e-5673b6fc209a","Type":"ContainerDied","Data":"9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8"}
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.129492 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9bef1af4-84af-4242-b22e-5673b6fc209a","Type":"ContainerDied","Data":"a31f250ceec446c2b72ec64444ac94415e795fdca8871e1b07a24f73c96a91f6"}
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.129527 4731 scope.go:117] "RemoveContainer" containerID="9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8"
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.164248 4731 scope.go:117] "RemoveContainer" containerID="9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8"
Dec 03 19:15:20 crc kubenswrapper[4731]: E1203 19:15:20.164988 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8\": container with ID starting with 9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8 not found: ID does not exist" containerID="9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8"
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.165032 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8"} err="failed to get container status \"9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8\": rpc error: code = NotFound desc = could not find container \"9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8\": container with ID starting with 9f17e00f1ef18d675f514343a01dd295f5a584446fb6c4102cdfb71a36e1a4a8 not found: ID does not exist"
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.175089 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6wkm\" (UniqueName: \"kubernetes.io/projected/9bef1af4-84af-4242-b22e-5673b6fc209a-kube-api-access-q6wkm\") pod \"9bef1af4-84af-4242-b22e-5673b6fc209a\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") "
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.175243 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-config-data\") pod \"9bef1af4-84af-4242-b22e-5673b6fc209a\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") "
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.175489 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-combined-ca-bundle\") pod \"9bef1af4-84af-4242-b22e-5673b6fc209a\" (UID: \"9bef1af4-84af-4242-b22e-5673b6fc209a\") "
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.182062 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bef1af4-84af-4242-b22e-5673b6fc209a-kube-api-access-q6wkm" (OuterVolumeSpecName: "kube-api-access-q6wkm") pod "9bef1af4-84af-4242-b22e-5673b6fc209a" (UID: "9bef1af4-84af-4242-b22e-5673b6fc209a"). InnerVolumeSpecName "kube-api-access-q6wkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.216786 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-config-data" (OuterVolumeSpecName: "config-data") pod "9bef1af4-84af-4242-b22e-5673b6fc209a" (UID: "9bef1af4-84af-4242-b22e-5673b6fc209a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.217580 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bef1af4-84af-4242-b22e-5673b6fc209a" (UID: "9bef1af4-84af-4242-b22e-5673b6fc209a"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.279188 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6wkm\" (UniqueName: \"kubernetes.io/projected/9bef1af4-84af-4242-b22e-5673b6fc209a-kube-api-access-q6wkm\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.279280 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.279304 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bef1af4-84af-4242-b22e-5673b6fc209a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.462533 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.472217 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.496990 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:15:20 crc kubenswrapper[4731]: E1203 19:15:20.497534 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea93acb8-78a3-40e3-9555-e98b23ec06c4" containerName="collect-profiles" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.497556 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea93acb8-78a3-40e3-9555-e98b23ec06c4" containerName="collect-profiles" Dec 03 19:15:20 crc kubenswrapper[4731]: E1203 19:15:20.497570 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bef1af4-84af-4242-b22e-5673b6fc209a" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 19:15:20 crc 
kubenswrapper[4731]: I1203 19:15:20.497581 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bef1af4-84af-4242-b22e-5673b6fc209a" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.497774 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea93acb8-78a3-40e3-9555-e98b23ec06c4" containerName="collect-profiles" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.497800 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bef1af4-84af-4242-b22e-5673b6fc209a" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.498641 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.501799 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.503888 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.503931 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.514986 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.688241 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.688424 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.688492 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.688537 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.688584 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbd57\" (UniqueName: \"kubernetes.io/projected/170f66c2-939d-42f6-af1e-f28c9cc92a71-kube-api-access-tbd57\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.790607 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.790669 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.790700 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbd57\" (UniqueName: \"kubernetes.io/projected/170f66c2-939d-42f6-af1e-f28c9cc92a71-kube-api-access-tbd57\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.790776 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.790845 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.797439 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.798166 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.799738 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.800651 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/170f66c2-939d-42f6-af1e-f28c9cc92a71-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.809032 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbd57\" (UniqueName: \"kubernetes.io/projected/170f66c2-939d-42f6-af1e-f28c9cc92a71-kube-api-access-tbd57\") pod \"nova-cell1-novncproxy-0\" (UID: \"170f66c2-939d-42f6-af1e-f28c9cc92a71\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:20 crc kubenswrapper[4731]: I1203 19:15:20.830237 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:21 crc kubenswrapper[4731]: I1203 19:15:21.340361 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 19:15:21 crc kubenswrapper[4731]: I1203 19:15:21.874062 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bef1af4-84af-4242-b22e-5673b6fc209a" path="/var/lib/kubelet/pods/9bef1af4-84af-4242-b22e-5673b6fc209a/volumes" Dec 03 19:15:22 crc kubenswrapper[4731]: I1203 19:15:22.149159 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"170f66c2-939d-42f6-af1e-f28c9cc92a71","Type":"ContainerStarted","Data":"41a8afc6d864b5a72afcb0095c8afd099473d4e05b9c81d374a3b1cecf59eedf"} Dec 03 19:15:22 crc kubenswrapper[4731]: I1203 19:15:22.149224 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"170f66c2-939d-42f6-af1e-f28c9cc92a71","Type":"ContainerStarted","Data":"f463f04e0714a21af78ab01dfbf1e7b9b0cc50e626d01a52563530ff324159a9"} Dec 03 19:15:22 crc kubenswrapper[4731]: I1203 19:15:22.180541 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.18051678 podStartE2EDuration="2.18051678s" podCreationTimestamp="2025-12-03 19:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:22.172002403 +0000 UTC m=+1242.770596887" watchObservedRunningTime="2025-12-03 19:15:22.18051678 +0000 UTC m=+1242.779111244" Dec 03 19:15:22 crc kubenswrapper[4731]: I1203 19:15:22.373824 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 19:15:22 crc kubenswrapper[4731]: I1203 19:15:22.374127 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 19:15:22 
crc kubenswrapper[4731]: I1203 19:15:22.374434 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 19:15:22 crc kubenswrapper[4731]: I1203 19:15:22.374784 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 19:15:22 crc kubenswrapper[4731]: I1203 19:15:22.378385 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 19:15:22 crc kubenswrapper[4731]: I1203 19:15:22.384630 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 19:15:24 crc kubenswrapper[4731]: I1203 19:15:24.860064 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:15:24 crc kubenswrapper[4731]: I1203 19:15:24.860946 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="ceilometer-central-agent" containerID="cri-o://45621dbe8ef595d6d18625ea022f2bb8ac1d6970a725152259d0be783ce712e6" gracePeriod=30 Dec 03 19:15:24 crc kubenswrapper[4731]: I1203 19:15:24.861022 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="ceilometer-notification-agent" containerID="cri-o://b13891c28077988b9b36e217a84ae66e9fd568464c62b4dcf4d5217a02b52ada" gracePeriod=30 Dec 03 19:15:24 crc kubenswrapper[4731]: I1203 19:15:24.861030 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="proxy-httpd" containerID="cri-o://c5233383df8d91e54cf1617280fa040fb23c8050974bf379848173c4796f6fd0" gracePeriod=30 Dec 03 19:15:24 crc kubenswrapper[4731]: I1203 19:15:24.861053 4731 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="sg-core" containerID="cri-o://2687bbaf72399581f9154a0cf2f2ef71f6f5de7bc3d48508e68558b27a00c9bd" gracePeriod=30 Dec 03 19:15:25 crc kubenswrapper[4731]: I1203 19:15:25.197306 4731 generic.go:334] "Generic (PLEG): container finished" podID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerID="c5233383df8d91e54cf1617280fa040fb23c8050974bf379848173c4796f6fd0" exitCode=0 Dec 03 19:15:25 crc kubenswrapper[4731]: I1203 19:15:25.197864 4731 generic.go:334] "Generic (PLEG): container finished" podID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerID="2687bbaf72399581f9154a0cf2f2ef71f6f5de7bc3d48508e68558b27a00c9bd" exitCode=2 Dec 03 19:15:25 crc kubenswrapper[4731]: I1203 19:15:25.197891 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerDied","Data":"c5233383df8d91e54cf1617280fa040fb23c8050974bf379848173c4796f6fd0"} Dec 03 19:15:25 crc kubenswrapper[4731]: I1203 19:15:25.197940 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerDied","Data":"2687bbaf72399581f9154a0cf2f2ef71f6f5de7bc3d48508e68558b27a00c9bd"} Dec 03 19:15:25 crc kubenswrapper[4731]: I1203 19:15:25.246989 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:25 crc kubenswrapper[4731]: I1203 19:15:25.247337 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-log" containerID="cri-o://6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d" gracePeriod=30 Dec 03 19:15:25 crc kubenswrapper[4731]: I1203 19:15:25.247976 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-api" containerID="cri-o://39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34" gracePeriod=30 Dec 03 19:15:25 crc kubenswrapper[4731]: I1203 19:15:25.830935 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:26 crc kubenswrapper[4731]: I1203 19:15:26.212341 4731 generic.go:334] "Generic (PLEG): container finished" podID="3319b516-34ef-4ddf-8939-da2d140e015d" containerID="6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d" exitCode=143 Dec 03 19:15:26 crc kubenswrapper[4731]: I1203 19:15:26.212434 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3319b516-34ef-4ddf-8939-da2d140e015d","Type":"ContainerDied","Data":"6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d"} Dec 03 19:15:26 crc kubenswrapper[4731]: I1203 19:15:26.216511 4731 generic.go:334] "Generic (PLEG): container finished" podID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerID="45621dbe8ef595d6d18625ea022f2bb8ac1d6970a725152259d0be783ce712e6" exitCode=0 Dec 03 19:15:26 crc kubenswrapper[4731]: I1203 19:15:26.216542 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerDied","Data":"45621dbe8ef595d6d18625ea022f2bb8ac1d6970a725152259d0be783ce712e6"} Dec 03 19:15:26 crc kubenswrapper[4731]: I1203 19:15:26.468424 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:15:26 crc kubenswrapper[4731]: I1203 19:15:26.468505 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" 
podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:15:28 crc kubenswrapper[4731]: I1203 19:15:28.916190 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.111394 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-config-data\") pod \"3319b516-34ef-4ddf-8939-da2d140e015d\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.111540 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtsst\" (UniqueName: \"kubernetes.io/projected/3319b516-34ef-4ddf-8939-da2d140e015d-kube-api-access-vtsst\") pod \"3319b516-34ef-4ddf-8939-da2d140e015d\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.111576 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-combined-ca-bundle\") pod \"3319b516-34ef-4ddf-8939-da2d140e015d\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.111646 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3319b516-34ef-4ddf-8939-da2d140e015d-logs\") pod \"3319b516-34ef-4ddf-8939-da2d140e015d\" (UID: \"3319b516-34ef-4ddf-8939-da2d140e015d\") " Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.112569 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3319b516-34ef-4ddf-8939-da2d140e015d-logs" 
(OuterVolumeSpecName: "logs") pod "3319b516-34ef-4ddf-8939-da2d140e015d" (UID: "3319b516-34ef-4ddf-8939-da2d140e015d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.121082 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3319b516-34ef-4ddf-8939-da2d140e015d-kube-api-access-vtsst" (OuterVolumeSpecName: "kube-api-access-vtsst") pod "3319b516-34ef-4ddf-8939-da2d140e015d" (UID: "3319b516-34ef-4ddf-8939-da2d140e015d"). InnerVolumeSpecName "kube-api-access-vtsst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.184145 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-config-data" (OuterVolumeSpecName: "config-data") pod "3319b516-34ef-4ddf-8939-da2d140e015d" (UID: "3319b516-34ef-4ddf-8939-da2d140e015d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.186350 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3319b516-34ef-4ddf-8939-da2d140e015d" (UID: "3319b516-34ef-4ddf-8939-da2d140e015d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.214636 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtsst\" (UniqueName: \"kubernetes.io/projected/3319b516-34ef-4ddf-8939-da2d140e015d-kube-api-access-vtsst\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.214674 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.214688 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3319b516-34ef-4ddf-8939-da2d140e015d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.214702 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3319b516-34ef-4ddf-8939-da2d140e015d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.249492 4731 generic.go:334] "Generic (PLEG): container finished" podID="3319b516-34ef-4ddf-8939-da2d140e015d" containerID="39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34" exitCode=0 Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.249547 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3319b516-34ef-4ddf-8939-da2d140e015d","Type":"ContainerDied","Data":"39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34"} Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.249583 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3319b516-34ef-4ddf-8939-da2d140e015d","Type":"ContainerDied","Data":"34969ce44eaf702f08bc265dd826e74ac8d8a70ee0e83c4e217486a353c138b1"} Dec 03 19:15:29 crc kubenswrapper[4731]: 
I1203 19:15:29.249606 4731 scope.go:117] "RemoveContainer" containerID="39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.249859 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.281137 4731 scope.go:117] "RemoveContainer" containerID="6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.293573 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.308814 4731 scope.go:117] "RemoveContainer" containerID="39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34" Dec 03 19:15:29 crc kubenswrapper[4731]: E1203 19:15:29.309223 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34\": container with ID starting with 39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34 not found: ID does not exist" containerID="39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.309290 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34"} err="failed to get container status \"39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34\": rpc error: code = NotFound desc = could not find container \"39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34\": container with ID starting with 39da7ace91e183b151e6218a05238c77c3f56a406f47c6b5ce144977ea6cae34 not found: ID does not exist" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.309319 4731 scope.go:117] "RemoveContainer" 
containerID="6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.309930 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:29 crc kubenswrapper[4731]: E1203 19:15:29.310202 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d\": container with ID starting with 6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d not found: ID does not exist" containerID="6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.310245 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d"} err="failed to get container status \"6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d\": rpc error: code = NotFound desc = could not find container \"6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d\": container with ID starting with 6363dc3580dfa188556ec8abb1c179273a5de9935485465ff2f1bc40d411ec4d not found: ID does not exist" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.321669 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:29 crc kubenswrapper[4731]: E1203 19:15:29.322307 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-log" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.322330 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-log" Dec 03 19:15:29 crc kubenswrapper[4731]: E1203 19:15:29.322368 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" 
containerName="nova-api-api" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.322376 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-api" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.322554 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-api" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.322578 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" containerName="nova-api-log" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.323741 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.328029 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.328103 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.328361 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.331944 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.524966 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-public-tls-certs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.526002 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.526049 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np428\" (UniqueName: \"kubernetes.io/projected/aa2821e4-50b8-412c-9520-8b596fffe57b-kube-api-access-np428\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.526220 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa2821e4-50b8-412c-9520-8b596fffe57b-logs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.526302 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.526598 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-config-data\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.628828 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-public-tls-certs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " 
pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.628922 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.628946 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np428\" (UniqueName: \"kubernetes.io/projected/aa2821e4-50b8-412c-9520-8b596fffe57b-kube-api-access-np428\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.629026 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa2821e4-50b8-412c-9520-8b596fffe57b-logs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.629049 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.629114 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-config-data\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.630345 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/aa2821e4-50b8-412c-9520-8b596fffe57b-logs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.637116 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-public-tls-certs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.637889 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-config-data\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.639011 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.649573 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.653336 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np428\" (UniqueName: \"kubernetes.io/projected/aa2821e4-50b8-412c-9520-8b596fffe57b-kube-api-access-np428\") pod \"nova-api-0\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " pod="openstack/nova-api-0" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.893688 4731 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="3319b516-34ef-4ddf-8939-da2d140e015d" path="/var/lib/kubelet/pods/3319b516-34ef-4ddf-8939-da2d140e015d/volumes" Dec 03 19:15:29 crc kubenswrapper[4731]: I1203 19:15:29.947044 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.265431 4731 generic.go:334] "Generic (PLEG): container finished" podID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerID="b13891c28077988b9b36e217a84ae66e9fd568464c62b4dcf4d5217a02b52ada" exitCode=0 Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.265512 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerDied","Data":"b13891c28077988b9b36e217a84ae66e9fd568464c62b4dcf4d5217a02b52ada"} Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.412159 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.545432 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.548977 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-sg-core-conf-yaml\") pod \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.549034 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-log-httpd\") pod \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.549054 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-ceilometer-tls-certs\") pod \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.549222 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-scripts\") pod \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.549345 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-combined-ca-bundle\") pod \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.549381 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-run-httpd\") pod \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.549455 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f49jw\" (UniqueName: \"kubernetes.io/projected/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-kube-api-access-f49jw\") pod \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\" (UID: \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.549502 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-config-data\") pod \"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\" (UID: 
\"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d\") " Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.553922 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" (UID: "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.554542 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" (UID: "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.560111 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-kube-api-access-f49jw" (OuterVolumeSpecName: "kube-api-access-f49jw") pod "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" (UID: "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d"). InnerVolumeSpecName "kube-api-access-f49jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.570479 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-scripts" (OuterVolumeSpecName: "scripts") pod "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" (UID: "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.602856 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" (UID: "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.653217 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" (UID: "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.653670 4731 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.653697 4731 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.653707 4731 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.653717 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-scripts\") on node \"crc\" 
DevicePath \"\"" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.653727 4731 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.653736 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f49jw\" (UniqueName: \"kubernetes.io/projected/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-kube-api-access-f49jw\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.688046 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" (UID: "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.726465 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-config-data" (OuterVolumeSpecName: "config-data") pod "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" (UID: "7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.755423 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.755469 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.831808 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:30 crc kubenswrapper[4731]: I1203 19:15:30.862237 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.281423 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d","Type":"ContainerDied","Data":"7d2eb7f22d213a21f5080fb056c75f653081f08ddd3708447469138e56620ea6"} Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.281556 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.281900 4731 scope.go:117] "RemoveContainer" containerID="c5233383df8d91e54cf1617280fa040fb23c8050974bf379848173c4796f6fd0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.284425 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa2821e4-50b8-412c-9520-8b596fffe57b","Type":"ContainerStarted","Data":"d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b"} Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.284509 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa2821e4-50b8-412c-9520-8b596fffe57b","Type":"ContainerStarted","Data":"d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952"} Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.284523 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa2821e4-50b8-412c-9520-8b596fffe57b","Type":"ContainerStarted","Data":"c4e29d31f28dabc377397d80861561502864df619d91f93ddac6afa7475c526f"} Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.302576 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.320945 4731 scope.go:117] "RemoveContainer" containerID="2687bbaf72399581f9154a0cf2f2ef71f6f5de7bc3d48508e68558b27a00c9bd" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.321343 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.321331109 podStartE2EDuration="2.321331109s" podCreationTimestamp="2025-12-03 19:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:31.31527041 +0000 UTC m=+1251.913864864" watchObservedRunningTime="2025-12-03 
19:15:31.321331109 +0000 UTC m=+1251.919925563" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.346059 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.356699 4731 scope.go:117] "RemoveContainer" containerID="b13891c28077988b9b36e217a84ae66e9fd568464c62b4dcf4d5217a02b52ada" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.369081 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.390343 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:15:31 crc kubenswrapper[4731]: E1203 19:15:31.390946 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="ceilometer-central-agent" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.390967 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="ceilometer-central-agent" Dec 03 19:15:31 crc kubenswrapper[4731]: E1203 19:15:31.390981 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="sg-core" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.390990 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="sg-core" Dec 03 19:15:31 crc kubenswrapper[4731]: E1203 19:15:31.391007 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="ceilometer-notification-agent" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.391013 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="ceilometer-notification-agent" Dec 03 19:15:31 crc kubenswrapper[4731]: E1203 19:15:31.391040 4731 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="proxy-httpd" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.391047 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="proxy-httpd" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.391243 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="ceilometer-notification-agent" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.391282 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="sg-core" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.391301 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="ceilometer-central-agent" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.391316 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" containerName="proxy-httpd" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.392586 4731 scope.go:117] "RemoveContainer" containerID="45621dbe8ef595d6d18625ea022f2bb8ac1d6970a725152259d0be783ce712e6" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.393436 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.397169 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.397215 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.398339 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.451370 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.567130 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-m77rq"] Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.569509 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.573119 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.573316 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.573744 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-config-data\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.573907 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.574086 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6948a0b4-f3c6-483a-b304-8c5cefee3e31-run-httpd\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.574287 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-scripts\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.574518 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6948a0b4-f3c6-483a-b304-8c5cefee3e31-log-httpd\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.574601 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.574639 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.574662 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l9bc\" (UniqueName: \"kubernetes.io/projected/6948a0b4-f3c6-483a-b304-8c5cefee3e31-kube-api-access-5l9bc\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.585098 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-m77rq"] Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677161 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-scripts\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677240 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-config-data\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677288 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6948a0b4-f3c6-483a-b304-8c5cefee3e31-log-httpd\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677319 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677343 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677365 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9bc\" (UniqueName: \"kubernetes.io/projected/6948a0b4-f3c6-483a-b304-8c5cefee3e31-kube-api-access-5l9bc\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677413 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-config-data\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677443 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677692 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 
19:15:31.677910 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6948a0b4-f3c6-483a-b304-8c5cefee3e31-log-httpd\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.677951 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m9sf\" (UniqueName: \"kubernetes.io/projected/01a4b141-8350-4132-8173-04c0dc9cc328-kube-api-access-8m9sf\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.678045 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6948a0b4-f3c6-483a-b304-8c5cefee3e31-run-httpd\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.678111 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-scripts\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.678495 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6948a0b4-f3c6-483a-b304-8c5cefee3e31-run-httpd\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.683343 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.683445 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.684834 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.685591 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-scripts\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.686416 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6948a0b4-f3c6-483a-b304-8c5cefee3e31-config-data\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.696075 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9bc\" (UniqueName: \"kubernetes.io/projected/6948a0b4-f3c6-483a-b304-8c5cefee3e31-kube-api-access-5l9bc\") pod \"ceilometer-0\" (UID: \"6948a0b4-f3c6-483a-b304-8c5cefee3e31\") " pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.757230 4731 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.780155 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m9sf\" (UniqueName: \"kubernetes.io/projected/01a4b141-8350-4132-8173-04c0dc9cc328-kube-api-access-8m9sf\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.780704 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-scripts\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.780772 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-config-data\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.780853 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.785081 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " 
pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.785984 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-scripts\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.787883 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-config-data\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.798648 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m9sf\" (UniqueName: \"kubernetes.io/projected/01a4b141-8350-4132-8173-04c0dc9cc328-kube-api-access-8m9sf\") pod \"nova-cell1-cell-mapping-m77rq\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.886414 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d" path="/var/lib/kubelet/pods/7bf5a25e-e86a-4a4d-b7b8-c23d36ef531d/volumes" Dec 03 19:15:31 crc kubenswrapper[4731]: I1203 19:15:31.899945 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:32 crc kubenswrapper[4731]: I1203 19:15:32.305640 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 19:15:32 crc kubenswrapper[4731]: I1203 19:15:32.443740 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-m77rq"] Dec 03 19:15:32 crc kubenswrapper[4731]: W1203 19:15:32.451227 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01a4b141_8350_4132_8173_04c0dc9cc328.slice/crio-ee5bac8af0349fb8136379e8e8916d96360dd55cf93b1201ff9080ca59fd89dc WatchSource:0}: Error finding container ee5bac8af0349fb8136379e8e8916d96360dd55cf93b1201ff9080ca59fd89dc: Status 404 returned error can't find the container with id ee5bac8af0349fb8136379e8e8916d96360dd55cf93b1201ff9080ca59fd89dc Dec 03 19:15:33 crc kubenswrapper[4731]: I1203 19:15:33.318491 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6948a0b4-f3c6-483a-b304-8c5cefee3e31","Type":"ContainerStarted","Data":"b8757ab46912bbd15ed93b6ee7c8f8f6aa80250ef06ff792120f23dc97ec2808"} Dec 03 19:15:33 crc kubenswrapper[4731]: I1203 19:15:33.319042 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6948a0b4-f3c6-483a-b304-8c5cefee3e31","Type":"ContainerStarted","Data":"415d60515deefbbd9e6a0ca8589384650430c66770f68af8b1968f28dc1cfd99"} Dec 03 19:15:33 crc kubenswrapper[4731]: I1203 19:15:33.321217 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-m77rq" event={"ID":"01a4b141-8350-4132-8173-04c0dc9cc328","Type":"ContainerStarted","Data":"e61e1ffa8e0f12f675542c84b1c21b31711ca6f6ca3d838a6c44803978f355f8"} Dec 03 19:15:33 crc kubenswrapper[4731]: I1203 19:15:33.321274 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-m77rq" event={"ID":"01a4b141-8350-4132-8173-04c0dc9cc328","Type":"ContainerStarted","Data":"ee5bac8af0349fb8136379e8e8916d96360dd55cf93b1201ff9080ca59fd89dc"} Dec 03 19:15:33 crc kubenswrapper[4731]: I1203 19:15:33.348243 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-m77rq" podStartSLOduration=2.3482222139999998 podStartE2EDuration="2.348222214s" podCreationTimestamp="2025-12-03 19:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:33.344046373 +0000 UTC m=+1253.942640837" watchObservedRunningTime="2025-12-03 19:15:33.348222214 +0000 UTC m=+1253.946816678" Dec 03 19:15:34 crc kubenswrapper[4731]: I1203 19:15:34.339464 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6948a0b4-f3c6-483a-b304-8c5cefee3e31","Type":"ContainerStarted","Data":"debfc2b538108035d87e2f0a5f78456f667333f84b2b8179b0d96ce896d7aa6d"} Dec 03 19:15:35 crc kubenswrapper[4731]: I1203 19:15:35.355183 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6948a0b4-f3c6-483a-b304-8c5cefee3e31","Type":"ContainerStarted","Data":"2bce5cafa4042ce0c2984354078cac31006257a1cebbf14cebae33761ad6ea4f"} Dec 03 19:15:36 crc kubenswrapper[4731]: I1203 19:15:36.392784 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6948a0b4-f3c6-483a-b304-8c5cefee3e31","Type":"ContainerStarted","Data":"21bf47f7cf6e2e03b9fee9a391bfbcad8e49c6a2ce159dfe4078160e1db8f5d6"} Dec 03 19:15:36 crc kubenswrapper[4731]: I1203 19:15:36.393492 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 19:15:36 crc kubenswrapper[4731]: I1203 19:15:36.424237 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.329019411 podStartE2EDuration="5.424193697s" podCreationTimestamp="2025-12-03 19:15:31 +0000 UTC" firstStartedPulling="2025-12-03 19:15:32.308723368 +0000 UTC m=+1252.907317832" lastFinishedPulling="2025-12-03 19:15:35.403897654 +0000 UTC m=+1256.002492118" observedRunningTime="2025-12-03 19:15:36.418049164 +0000 UTC m=+1257.016643628" watchObservedRunningTime="2025-12-03 19:15:36.424193697 +0000 UTC m=+1257.022788161" Dec 03 19:15:39 crc kubenswrapper[4731]: I1203 19:15:39.427341 4731 generic.go:334] "Generic (PLEG): container finished" podID="01a4b141-8350-4132-8173-04c0dc9cc328" containerID="e61e1ffa8e0f12f675542c84b1c21b31711ca6f6ca3d838a6c44803978f355f8" exitCode=0 Dec 03 19:15:39 crc kubenswrapper[4731]: I1203 19:15:39.427902 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-m77rq" event={"ID":"01a4b141-8350-4132-8173-04c0dc9cc328","Type":"ContainerDied","Data":"e61e1ffa8e0f12f675542c84b1c21b31711ca6f6ca3d838a6c44803978f355f8"} Dec 03 19:15:39 crc kubenswrapper[4731]: I1203 19:15:39.948573 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 19:15:39 crc kubenswrapper[4731]: I1203 19:15:39.948675 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.874373 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.923156 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-config-data\") pod \"01a4b141-8350-4132-8173-04c0dc9cc328\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.923227 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m9sf\" (UniqueName: \"kubernetes.io/projected/01a4b141-8350-4132-8173-04c0dc9cc328-kube-api-access-8m9sf\") pod \"01a4b141-8350-4132-8173-04c0dc9cc328\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.923308 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-scripts\") pod \"01a4b141-8350-4132-8173-04c0dc9cc328\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.923448 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-combined-ca-bundle\") pod \"01a4b141-8350-4132-8173-04c0dc9cc328\" (UID: \"01a4b141-8350-4132-8173-04c0dc9cc328\") " Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.931942 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a4b141-8350-4132-8173-04c0dc9cc328-kube-api-access-8m9sf" (OuterVolumeSpecName: "kube-api-access-8m9sf") pod "01a4b141-8350-4132-8173-04c0dc9cc328" (UID: "01a4b141-8350-4132-8173-04c0dc9cc328"). InnerVolumeSpecName "kube-api-access-8m9sf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.933458 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-scripts" (OuterVolumeSpecName: "scripts") pod "01a4b141-8350-4132-8173-04c0dc9cc328" (UID: "01a4b141-8350-4132-8173-04c0dc9cc328"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.954664 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.955120 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.963927 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-config-data" (OuterVolumeSpecName: "config-data") pod "01a4b141-8350-4132-8173-04c0dc9cc328" (UID: "01a4b141-8350-4132-8173-04c0dc9cc328"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:40 crc kubenswrapper[4731]: I1203 19:15:40.968969 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01a4b141-8350-4132-8173-04c0dc9cc328" (UID: "01a4b141-8350-4132-8173-04c0dc9cc328"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.026936 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.027027 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m9sf\" (UniqueName: \"kubernetes.io/projected/01a4b141-8350-4132-8173-04c0dc9cc328-kube-api-access-8m9sf\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.027069 4731 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.027082 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a4b141-8350-4132-8173-04c0dc9cc328-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.457910 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-m77rq" event={"ID":"01a4b141-8350-4132-8173-04c0dc9cc328","Type":"ContainerDied","Data":"ee5bac8af0349fb8136379e8e8916d96360dd55cf93b1201ff9080ca59fd89dc"} Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.457961 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee5bac8af0349fb8136379e8e8916d96360dd55cf93b1201ff9080ca59fd89dc" Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.458034 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m77rq" Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.658671 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.659076 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-log" containerID="cri-o://d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952" gracePeriod=30 Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.659170 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-api" containerID="cri-o://d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b" gracePeriod=30 Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.677594 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.678149 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="32385ead-eb98-476c-89e5-5d61b693d178" containerName="nova-scheduler-scheduler" containerID="cri-o://5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9" gracePeriod=30 Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.712401 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.712713 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-log" containerID="cri-o://44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69" gracePeriod=30 Dec 03 19:15:41 crc kubenswrapper[4731]: I1203 19:15:41.712911 4731 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-metadata" containerID="cri-o://1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155" gracePeriod=30 Dec 03 19:15:42 crc kubenswrapper[4731]: I1203 19:15:42.480406 4731 generic.go:334] "Generic (PLEG): container finished" podID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerID="d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952" exitCode=143 Dec 03 19:15:42 crc kubenswrapper[4731]: I1203 19:15:42.480511 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa2821e4-50b8-412c-9520-8b596fffe57b","Type":"ContainerDied","Data":"d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952"} Dec 03 19:15:42 crc kubenswrapper[4731]: I1203 19:15:42.483785 4731 generic.go:334] "Generic (PLEG): container finished" podID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerID="44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69" exitCode=143 Dec 03 19:15:42 crc kubenswrapper[4731]: I1203 19:15:42.483818 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5a9b5335-b617-430e-bd5b-0dbbac80d148","Type":"ContainerDied","Data":"44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69"} Dec 03 19:15:42 crc kubenswrapper[4731]: E1203 19:15:42.687555 4731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 19:15:42 crc kubenswrapper[4731]: E1203 19:15:42.689422 4731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 19:15:42 crc kubenswrapper[4731]: E1203 19:15:42.691098 4731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 19:15:42 crc kubenswrapper[4731]: E1203 19:15:42.691170 4731 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="32385ead-eb98-476c-89e5-5d61b693d178" containerName="nova-scheduler-scheduler" Dec 03 19:15:44 crc kubenswrapper[4731]: I1203 19:15:44.869742 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.182:8775/\": read tcp 10.217.0.2:35458->10.217.0.182:8775: read: connection reset by peer" Dec 03 19:15:44 crc kubenswrapper[4731]: I1203 19:15:44.869746 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.182:8775/\": read tcp 10.217.0.2:35464->10.217.0.182:8775: read: connection reset by peer" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.406423 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.522401 4731 generic.go:334] "Generic (PLEG): container finished" podID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerID="1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155" exitCode=0 Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.522463 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5a9b5335-b617-430e-bd5b-0dbbac80d148","Type":"ContainerDied","Data":"1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155"} Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.522506 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5a9b5335-b617-430e-bd5b-0dbbac80d148","Type":"ContainerDied","Data":"b8345ded1c8b06ecab40f1977d0a65de307e1846df6c096371bf3bc1a5856564"} Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.522529 4731 scope.go:117] "RemoveContainer" containerID="1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.522721 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.532287 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-config-data\") pod \"5a9b5335-b617-430e-bd5b-0dbbac80d148\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.532363 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a9b5335-b617-430e-bd5b-0dbbac80d148-logs\") pod \"5a9b5335-b617-430e-bd5b-0dbbac80d148\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.532513 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-nova-metadata-tls-certs\") pod \"5a9b5335-b617-430e-bd5b-0dbbac80d148\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.532555 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-combined-ca-bundle\") pod \"5a9b5335-b617-430e-bd5b-0dbbac80d148\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.532683 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ckdl\" (UniqueName: \"kubernetes.io/projected/5a9b5335-b617-430e-bd5b-0dbbac80d148-kube-api-access-7ckdl\") pod \"5a9b5335-b617-430e-bd5b-0dbbac80d148\" (UID: \"5a9b5335-b617-430e-bd5b-0dbbac80d148\") " Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.533259 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5a9b5335-b617-430e-bd5b-0dbbac80d148-logs" (OuterVolumeSpecName: "logs") pod "5a9b5335-b617-430e-bd5b-0dbbac80d148" (UID: "5a9b5335-b617-430e-bd5b-0dbbac80d148"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.550040 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9b5335-b617-430e-bd5b-0dbbac80d148-kube-api-access-7ckdl" (OuterVolumeSpecName: "kube-api-access-7ckdl") pod "5a9b5335-b617-430e-bd5b-0dbbac80d148" (UID: "5a9b5335-b617-430e-bd5b-0dbbac80d148"). InnerVolumeSpecName "kube-api-access-7ckdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.576405 4731 scope.go:117] "RemoveContainer" containerID="44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.578548 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a9b5335-b617-430e-bd5b-0dbbac80d148" (UID: "5a9b5335-b617-430e-bd5b-0dbbac80d148"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.592239 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-config-data" (OuterVolumeSpecName: "config-data") pod "5a9b5335-b617-430e-bd5b-0dbbac80d148" (UID: "5a9b5335-b617-430e-bd5b-0dbbac80d148"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.593943 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5a9b5335-b617-430e-bd5b-0dbbac80d148" (UID: "5a9b5335-b617-430e-bd5b-0dbbac80d148"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.636947 4731 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.636993 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.637003 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ckdl\" (UniqueName: \"kubernetes.io/projected/5a9b5335-b617-430e-bd5b-0dbbac80d148-kube-api-access-7ckdl\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.637012 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9b5335-b617-430e-bd5b-0dbbac80d148-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.637022 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a9b5335-b617-430e-bd5b-0dbbac80d148-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.666883 4731 scope.go:117] "RemoveContainer" 
containerID="1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155" Dec 03 19:15:45 crc kubenswrapper[4731]: E1203 19:15:45.667477 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155\": container with ID starting with 1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155 not found: ID does not exist" containerID="1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.667544 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155"} err="failed to get container status \"1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155\": rpc error: code = NotFound desc = could not find container \"1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155\": container with ID starting with 1057740ea43891e8af12777a7703db25e70a5f39769deab38cb6986c133d1155 not found: ID does not exist" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.667582 4731 scope.go:117] "RemoveContainer" containerID="44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69" Dec 03 19:15:45 crc kubenswrapper[4731]: E1203 19:15:45.668528 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69\": container with ID starting with 44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69 not found: ID does not exist" containerID="44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.668567 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69"} err="failed to get container status \"44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69\": rpc error: code = NotFound desc = could not find container \"44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69\": container with ID starting with 44434f8d0f383f42f653c862465fa08fa1c23bb25bfaf0cb4460da0963923a69 not found: ID does not exist" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.877892 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.893611 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.906384 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:15:45 crc kubenswrapper[4731]: E1203 19:15:45.906913 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-metadata" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.906933 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-metadata" Dec 03 19:15:45 crc kubenswrapper[4731]: E1203 19:15:45.906947 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a4b141-8350-4132-8173-04c0dc9cc328" containerName="nova-manage" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.906957 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a4b141-8350-4132-8173-04c0dc9cc328" containerName="nova-manage" Dec 03 19:15:45 crc kubenswrapper[4731]: E1203 19:15:45.906984 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-log" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.906991 4731 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-log" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.907176 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-metadata" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.907213 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" containerName="nova-metadata-log" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.907837 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a4b141-8350-4132-8173-04c0dc9cc328" containerName="nova-manage" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.909186 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.914128 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.924069 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.948022 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46ad6da-321d-4263-a203-8d9f47b1ab43-logs\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.948191 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" 
Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.948219 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.948313 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5gc4\" (UniqueName: \"kubernetes.io/projected/c46ad6da-321d-4263-a203-8d9f47b1ab43-kube-api-access-n5gc4\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.948364 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-config-data\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:45 crc kubenswrapper[4731]: I1203 19:15:45.960997 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.050557 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46ad6da-321d-4263-a203-8d9f47b1ab43-logs\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.050677 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.051592 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.051078 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46ad6da-321d-4263-a203-8d9f47b1ab43-logs\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.051678 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5gc4\" (UniqueName: \"kubernetes.io/projected/c46ad6da-321d-4263-a203-8d9f47b1ab43-kube-api-access-n5gc4\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.051862 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-config-data\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.055434 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.056373 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-config-data\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.058548 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c46ad6da-321d-4263-a203-8d9f47b1ab43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.071308 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5gc4\" (UniqueName: \"kubernetes.io/projected/c46ad6da-321d-4263-a203-8d9f47b1ab43-kube-api-access-n5gc4\") pod \"nova-metadata-0\" (UID: \"c46ad6da-321d-4263-a203-8d9f47b1ab43\") " pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.305256 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 19:15:46 crc kubenswrapper[4731]: I1203 19:15:46.809488 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.397864 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.489066 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxncs\" (UniqueName: \"kubernetes.io/projected/32385ead-eb98-476c-89e5-5d61b693d178-kube-api-access-mxncs\") pod \"32385ead-eb98-476c-89e5-5d61b693d178\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.489628 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-combined-ca-bundle\") pod \"32385ead-eb98-476c-89e5-5d61b693d178\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.489719 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-config-data\") pod \"32385ead-eb98-476c-89e5-5d61b693d178\" (UID: \"32385ead-eb98-476c-89e5-5d61b693d178\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.498809 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32385ead-eb98-476c-89e5-5d61b693d178-kube-api-access-mxncs" (OuterVolumeSpecName: "kube-api-access-mxncs") pod "32385ead-eb98-476c-89e5-5d61b693d178" (UID: "32385ead-eb98-476c-89e5-5d61b693d178"). InnerVolumeSpecName "kube-api-access-mxncs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.526699 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-config-data" (OuterVolumeSpecName: "config-data") pod "32385ead-eb98-476c-89e5-5d61b693d178" (UID: "32385ead-eb98-476c-89e5-5d61b693d178"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.526861 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32385ead-eb98-476c-89e5-5d61b693d178" (UID: "32385ead-eb98-476c-89e5-5d61b693d178"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.530211 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.565849 4731 generic.go:334] "Generic (PLEG): container finished" podID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerID="d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b" exitCode=0 Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.565929 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa2821e4-50b8-412c-9520-8b596fffe57b","Type":"ContainerDied","Data":"d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b"} Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.565967 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa2821e4-50b8-412c-9520-8b596fffe57b","Type":"ContainerDied","Data":"c4e29d31f28dabc377397d80861561502864df619d91f93ddac6afa7475c526f"} Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.565986 4731 scope.go:117] "RemoveContainer" containerID="d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.566174 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.583016 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c46ad6da-321d-4263-a203-8d9f47b1ab43","Type":"ContainerStarted","Data":"5346bae8c814743df23c9db40c8f88c1c38265fa1c12b5d2338270e032eac54b"} Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.583074 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c46ad6da-321d-4263-a203-8d9f47b1ab43","Type":"ContainerStarted","Data":"c4288f07b62c52d785947f311885b3ff025595f4d8db33811348b924b25018fe"} Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.583086 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c46ad6da-321d-4263-a203-8d9f47b1ab43","Type":"ContainerStarted","Data":"ea6f91e74cc3b0337687efc0f92add6af04c5e8b2e3e0d3692aada3dbfebc4c7"} Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.586275 4731 generic.go:334] "Generic (PLEG): container finished" podID="32385ead-eb98-476c-89e5-5d61b693d178" containerID="5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9" exitCode=0 Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.586315 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32385ead-eb98-476c-89e5-5d61b693d178","Type":"ContainerDied","Data":"5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9"} Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.586947 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.587588 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32385ead-eb98-476c-89e5-5d61b693d178","Type":"ContainerDied","Data":"c99671e8eb896dd6afd462368158a8d2073de1b17055ca1d3308190f9df38e26"} Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.592200 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-config-data\") pod \"aa2821e4-50b8-412c-9520-8b596fffe57b\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.592260 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-public-tls-certs\") pod \"aa2821e4-50b8-412c-9520-8b596fffe57b\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.592381 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np428\" (UniqueName: \"kubernetes.io/projected/aa2821e4-50b8-412c-9520-8b596fffe57b-kube-api-access-np428\") pod \"aa2821e4-50b8-412c-9520-8b596fffe57b\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.592426 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa2821e4-50b8-412c-9520-8b596fffe57b-logs\") pod \"aa2821e4-50b8-412c-9520-8b596fffe57b\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.592560 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-internal-tls-certs\") pod \"aa2821e4-50b8-412c-9520-8b596fffe57b\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.592602 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-combined-ca-bundle\") pod \"aa2821e4-50b8-412c-9520-8b596fffe57b\" (UID: \"aa2821e4-50b8-412c-9520-8b596fffe57b\") " Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.593203 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxncs\" (UniqueName: \"kubernetes.io/projected/32385ead-eb98-476c-89e5-5d61b693d178-kube-api-access-mxncs\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.593223 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.593233 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32385ead-eb98-476c-89e5-5d61b693d178-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.593846 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa2821e4-50b8-412c-9520-8b596fffe57b-logs" (OuterVolumeSpecName: "logs") pod "aa2821e4-50b8-412c-9520-8b596fffe57b" (UID: "aa2821e4-50b8-412c-9520-8b596fffe57b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.599084 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2821e4-50b8-412c-9520-8b596fffe57b-kube-api-access-np428" (OuterVolumeSpecName: "kube-api-access-np428") pod "aa2821e4-50b8-412c-9520-8b596fffe57b" (UID: "aa2821e4-50b8-412c-9520-8b596fffe57b"). InnerVolumeSpecName "kube-api-access-np428". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.621560 4731 scope.go:117] "RemoveContainer" containerID="d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.633317 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.633291043 podStartE2EDuration="2.633291043s" podCreationTimestamp="2025-12-03 19:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:47.602213606 +0000 UTC m=+1268.200808080" watchObservedRunningTime="2025-12-03 19:15:47.633291043 +0000 UTC m=+1268.231885507" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.642087 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-config-data" (OuterVolumeSpecName: "config-data") pod "aa2821e4-50b8-412c-9520-8b596fffe57b" (UID: "aa2821e4-50b8-412c-9520-8b596fffe57b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.663651 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa2821e4-50b8-412c-9520-8b596fffe57b" (UID: "aa2821e4-50b8-412c-9520-8b596fffe57b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.663833 4731 scope.go:117] "RemoveContainer" containerID="d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b" Dec 03 19:15:47 crc kubenswrapper[4731]: E1203 19:15:47.664979 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b\": container with ID starting with d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b not found: ID does not exist" containerID="d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.665040 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b"} err="failed to get container status \"d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b\": rpc error: code = NotFound desc = could not find container \"d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b\": container with ID starting with d3837c159d53690c89f98bdf5ec785b15c6cdaf5e568ba4129de67d9b970591b not found: ID does not exist" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.665068 4731 scope.go:117] "RemoveContainer" containerID="d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952" Dec 03 19:15:47 crc kubenswrapper[4731]: E1203 19:15:47.665393 4731 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952\": container with ID starting with d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952 not found: ID does not exist" containerID="d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.665425 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952"} err="failed to get container status \"d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952\": rpc error: code = NotFound desc = could not find container \"d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952\": container with ID starting with d6d8e49a634abf641a8752ed7cc1e0a5d1fbbfde9756e181f5bd1ef1d5b9b952 not found: ID does not exist" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.665443 4731 scope.go:117] "RemoveContainer" containerID="5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.682893 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aa2821e4-50b8-412c-9520-8b596fffe57b" (UID: "aa2821e4-50b8-412c-9520-8b596fffe57b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.691336 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.692415 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aa2821e4-50b8-412c-9520-8b596fffe57b" (UID: "aa2821e4-50b8-412c-9520-8b596fffe57b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.692598 4731 scope.go:117] "RemoveContainer" containerID="5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9" Dec 03 19:15:47 crc kubenswrapper[4731]: E1203 19:15:47.692973 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9\": container with ID starting with 5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9 not found: ID does not exist" containerID="5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.693004 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9"} err="failed to get container status \"5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9\": rpc error: code = NotFound desc = could not find container \"5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9\": container with ID starting with 5f0d224fd9f0e75ca70f833d993016897508cbebd0f5460859524856bd0543b9 not found: ID does not exist" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.695497 4731 reconciler_common.go:293] "Volume 
detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.695533 4731 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.695567 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.695588 4731 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa2821e4-50b8-412c-9520-8b596fffe57b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.695601 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np428\" (UniqueName: \"kubernetes.io/projected/aa2821e4-50b8-412c-9520-8b596fffe57b-kube-api-access-np428\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.695612 4731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa2821e4-50b8-412c-9520-8b596fffe57b-logs\") on node \"crc\" DevicePath \"\"" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.751428 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.762097 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:15:47 crc kubenswrapper[4731]: E1203 19:15:47.762811 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" 
containerName="nova-api-log" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.762834 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-log" Dec 03 19:15:47 crc kubenswrapper[4731]: E1203 19:15:47.762857 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-api" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.762865 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-api" Dec 03 19:15:47 crc kubenswrapper[4731]: E1203 19:15:47.762901 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32385ead-eb98-476c-89e5-5d61b693d178" containerName="nova-scheduler-scheduler" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.762908 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="32385ead-eb98-476c-89e5-5d61b693d178" containerName="nova-scheduler-scheduler" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.763132 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="32385ead-eb98-476c-89e5-5d61b693d178" containerName="nova-scheduler-scheduler" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.763162 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-log" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.763177 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" containerName="nova-api-api" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.764928 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.769161 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.773168 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.797465 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvhvw\" (UniqueName: \"kubernetes.io/projected/fe568c47-889f-4487-a5e1-9cd479fd0145-kube-api-access-wvhvw\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.797528 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe568c47-889f-4487-a5e1-9cd479fd0145-config-data\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.797596 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe568c47-889f-4487-a5e1-9cd479fd0145-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.867153 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32385ead-eb98-476c-89e5-5d61b693d178" path="/var/lib/kubelet/pods/32385ead-eb98-476c-89e5-5d61b693d178/volumes" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.867771 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9b5335-b617-430e-bd5b-0dbbac80d148" 
path="/var/lib/kubelet/pods/5a9b5335-b617-430e-bd5b-0dbbac80d148/volumes" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.900416 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe568c47-889f-4487-a5e1-9cd479fd0145-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.900965 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvhvw\" (UniqueName: \"kubernetes.io/projected/fe568c47-889f-4487-a5e1-9cd479fd0145-kube-api-access-wvhvw\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.901191 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe568c47-889f-4487-a5e1-9cd479fd0145-config-data\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.908314 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe568c47-889f-4487-a5e1-9cd479fd0145-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.908374 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe568c47-889f-4487-a5e1-9cd479fd0145-config-data\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.914528 4731 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.928294 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.928491 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvhvw\" (UniqueName: \"kubernetes.io/projected/fe568c47-889f-4487-a5e1-9cd479fd0145-kube-api-access-wvhvw\") pod \"nova-scheduler-0\" (UID: \"fe568c47-889f-4487-a5e1-9cd479fd0145\") " pod="openstack/nova-scheduler-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.943042 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.945045 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.947098 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.947551 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.947731 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 19:15:47 crc kubenswrapper[4731]: I1203 19:15:47.960779 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.003386 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.003455 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-config-data\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.003642 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-public-tls-certs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.003750 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgj92\" (UniqueName: \"kubernetes.io/projected/2abf2726-a215-4e18-a1f8-a05cdda42b69-kube-api-access-wgj92\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.003800 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.003823 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2abf2726-a215-4e18-a1f8-a05cdda42b69-logs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.089838 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.105214 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.105258 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2abf2726-a215-4e18-a1f8-a05cdda42b69-logs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.105345 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.105387 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-config-data\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.105500 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-public-tls-certs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.105531 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgj92\" (UniqueName: 
\"kubernetes.io/projected/2abf2726-a215-4e18-a1f8-a05cdda42b69-kube-api-access-wgj92\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.106613 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2abf2726-a215-4e18-a1f8-a05cdda42b69-logs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.110205 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.110699 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-public-tls-certs\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.112005 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-config-data\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.115083 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abf2726-a215-4e18-a1f8-a05cdda42b69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.131738 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wgj92\" (UniqueName: \"kubernetes.io/projected/2abf2726-a215-4e18-a1f8-a05cdda42b69-kube-api-access-wgj92\") pod \"nova-api-0\" (UID: \"2abf2726-a215-4e18-a1f8-a05cdda42b69\") " pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.262652 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.614828 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 19:15:48 crc kubenswrapper[4731]: W1203 19:15:48.615326 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe568c47_889f_4487_a5e1_9cd479fd0145.slice/crio-f7d435917386e66c4f69811d9e046b334cb07603ded5502fb97fea8577a4c8aa WatchSource:0}: Error finding container f7d435917386e66c4f69811d9e046b334cb07603ded5502fb97fea8577a4c8aa: Status 404 returned error can't find the container with id f7d435917386e66c4f69811d9e046b334cb07603ded5502fb97fea8577a4c8aa Dec 03 19:15:48 crc kubenswrapper[4731]: I1203 19:15:48.788646 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 19:15:48 crc kubenswrapper[4731]: W1203 19:15:48.798131 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2abf2726_a215_4e18_a1f8_a05cdda42b69.slice/crio-f36cd440e0a397c66bb157204c62274f990136c96705ddec73734f02f91075ce WatchSource:0}: Error finding container f36cd440e0a397c66bb157204c62274f990136c96705ddec73734f02f91075ce: Status 404 returned error can't find the container with id f36cd440e0a397c66bb157204c62274f990136c96705ddec73734f02f91075ce Dec 03 19:15:49 crc kubenswrapper[4731]: I1203 19:15:49.621396 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2abf2726-a215-4e18-a1f8-a05cdda42b69","Type":"ContainerStarted","Data":"29da041dffe292dc7a1a4cf2b915e27ff47cc65f03d0a2c5a9867ef75ba7200e"} Dec 03 19:15:49 crc kubenswrapper[4731]: I1203 19:15:49.622028 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2abf2726-a215-4e18-a1f8-a05cdda42b69","Type":"ContainerStarted","Data":"a81f0c3f03a37b523bee86ae7266b452f171b0822ec2b2e54f8882f09451601c"} Dec 03 19:15:49 crc kubenswrapper[4731]: I1203 19:15:49.622045 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2abf2726-a215-4e18-a1f8-a05cdda42b69","Type":"ContainerStarted","Data":"f36cd440e0a397c66bb157204c62274f990136c96705ddec73734f02f91075ce"} Dec 03 19:15:49 crc kubenswrapper[4731]: I1203 19:15:49.624022 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe568c47-889f-4487-a5e1-9cd479fd0145","Type":"ContainerStarted","Data":"801732a47a520c2ed20cbf1fa72b36753a380a2a92cee3bf631c0c976fd8a616"} Dec 03 19:15:49 crc kubenswrapper[4731]: I1203 19:15:49.624089 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe568c47-889f-4487-a5e1-9cd479fd0145","Type":"ContainerStarted","Data":"f7d435917386e66c4f69811d9e046b334cb07603ded5502fb97fea8577a4c8aa"} Dec 03 19:15:49 crc kubenswrapper[4731]: I1203 19:15:49.674456 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.674428876 podStartE2EDuration="2.674428876s" podCreationTimestamp="2025-12-03 19:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:49.667215739 +0000 UTC m=+1270.265810223" watchObservedRunningTime="2025-12-03 19:15:49.674428876 +0000 UTC m=+1270.273023340" Dec 03 19:15:49 crc kubenswrapper[4731]: I1203 19:15:49.678150 4731 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.678129202 podStartE2EDuration="2.678129202s" podCreationTimestamp="2025-12-03 19:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:15:49.650312647 +0000 UTC m=+1270.248907121" watchObservedRunningTime="2025-12-03 19:15:49.678129202 +0000 UTC m=+1270.276723666" Dec 03 19:15:49 crc kubenswrapper[4731]: I1203 19:15:49.876892 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2821e4-50b8-412c-9520-8b596fffe57b" path="/var/lib/kubelet/pods/aa2821e4-50b8-412c-9520-8b596fffe57b/volumes" Dec 03 19:15:51 crc kubenswrapper[4731]: I1203 19:15:51.311713 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 19:15:51 crc kubenswrapper[4731]: I1203 19:15:51.311799 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 19:15:53 crc kubenswrapper[4731]: I1203 19:15:53.090118 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 19:15:56 crc kubenswrapper[4731]: I1203 19:15:56.306093 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 19:15:56 crc kubenswrapper[4731]: I1203 19:15:56.306972 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 19:15:56 crc kubenswrapper[4731]: I1203 19:15:56.469471 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:15:56 crc kubenswrapper[4731]: I1203 19:15:56.469564 4731 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:15:57 crc kubenswrapper[4731]: I1203 19:15:57.325551 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c46ad6da-321d-4263-a203-8d9f47b1ab43" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:15:57 crc kubenswrapper[4731]: I1203 19:15:57.325562 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c46ad6da-321d-4263-a203-8d9f47b1ab43" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:15:58 crc kubenswrapper[4731]: I1203 19:15:58.090808 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 19:15:58 crc kubenswrapper[4731]: I1203 19:15:58.117186 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 19:15:58 crc kubenswrapper[4731]: I1203 19:15:58.264510 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 19:15:58 crc kubenswrapper[4731]: I1203 19:15:58.264963 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 19:15:58 crc kubenswrapper[4731]: I1203 19:15:58.789025 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 19:15:59 crc kubenswrapper[4731]: I1203 19:15:59.280435 4731 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2abf2726-a215-4e18-a1f8-a05cdda42b69" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.192:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:15:59 crc kubenswrapper[4731]: I1203 19:15:59.281050 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2abf2726-a215-4e18-a1f8-a05cdda42b69" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.192:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 19:16:01 crc kubenswrapper[4731]: I1203 19:16:01.770309 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 19:16:06 crc kubenswrapper[4731]: I1203 19:16:06.313834 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 19:16:06 crc kubenswrapper[4731]: I1203 19:16:06.314740 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 19:16:06 crc kubenswrapper[4731]: I1203 19:16:06.323560 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 19:16:06 crc kubenswrapper[4731]: I1203 19:16:06.323686 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 19:16:08 crc kubenswrapper[4731]: I1203 19:16:08.276780 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 19:16:08 crc kubenswrapper[4731]: I1203 19:16:08.279289 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 19:16:08 crc kubenswrapper[4731]: I1203 19:16:08.281448 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 
19:16:08 crc kubenswrapper[4731]: I1203 19:16:08.324187 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 19:16:08 crc kubenswrapper[4731]: I1203 19:16:08.827823 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 19:16:08 crc kubenswrapper[4731]: I1203 19:16:08.836344 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 19:16:17 crc kubenswrapper[4731]: I1203 19:16:17.012870 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:16:18 crc kubenswrapper[4731]: I1203 19:16:18.220179 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:16:21 crc kubenswrapper[4731]: I1203 19:16:21.746200 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" containerName="rabbitmq" containerID="cri-o://6c86f97930867d7e96e7d64a75e8e0731c393d4dc8cbab33295a96de1806f905" gracePeriod=604796 Dec 03 19:16:22 crc kubenswrapper[4731]: I1203 19:16:22.697707 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="469480bc-e167-4ecc-87c4-9691057d999f" containerName="rabbitmq" containerID="cri-o://98b24cea655e157dd1c4c37f068e846e36262777c12548713701f7a9b89f8f01" gracePeriod=604796 Dec 03 19:16:24 crc kubenswrapper[4731]: I1203 19:16:24.142693 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Dec 03 19:16:24 crc kubenswrapper[4731]: I1203 19:16:24.456729 4731 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" 
podUID="469480bc-e167-4ecc-87c4-9691057d999f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Dec 03 19:16:26 crc kubenswrapper[4731]: I1203 19:16:26.469283 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:16:26 crc kubenswrapper[4731]: I1203 19:16:26.470470 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:16:26 crc kubenswrapper[4731]: I1203 19:16:26.470598 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:16:26 crc kubenswrapper[4731]: I1203 19:16:26.471490 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2f44e072bc88870db26a778f706559cfc353499a5a66a4b1e40841fc6944db0"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:16:26 crc kubenswrapper[4731]: I1203 19:16:26.471624 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://e2f44e072bc88870db26a778f706559cfc353499a5a66a4b1e40841fc6944db0" gracePeriod=600 Dec 03 19:16:27 crc kubenswrapper[4731]: I1203 
19:16:27.019193 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="e2f44e072bc88870db26a778f706559cfc353499a5a66a4b1e40841fc6944db0" exitCode=0 Dec 03 19:16:27 crc kubenswrapper[4731]: I1203 19:16:27.019682 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"e2f44e072bc88870db26a778f706559cfc353499a5a66a4b1e40841fc6944db0"} Dec 03 19:16:27 crc kubenswrapper[4731]: I1203 19:16:27.019721 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808"} Dec 03 19:16:27 crc kubenswrapper[4731]: I1203 19:16:27.019742 4731 scope.go:117] "RemoveContainer" containerID="4e59b557ae762b84b60c06ac0b9fadc27bee96a8b13a95e3f34bd03098de4d47" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.041353 4731 generic.go:334] "Generic (PLEG): container finished" podID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" containerID="6c86f97930867d7e96e7d64a75e8e0731c393d4dc8cbab33295a96de1806f905" exitCode=0 Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.041440 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b","Type":"ContainerDied","Data":"6c86f97930867d7e96e7d64a75e8e0731c393d4dc8cbab33295a96de1806f905"} Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.421518 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.537010 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-plugins-conf\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.537484 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-server-conf\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.537633 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fm4g\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-kube-api-access-5fm4g\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.537768 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-plugins\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.537876 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-confd\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.537961 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-pod-info\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.538040 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.538128 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-tls\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.538177 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.538208 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-erlang-cookie-secret\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.538335 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-erlang-cookie\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.538499 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-config-data\") pod \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\" (UID: \"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b\") " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.538792 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.539158 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.539492 4731 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.539511 4731 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.539522 4731 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.545838 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.546419 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.548705 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-kube-api-access-5fm4g" (OuterVolumeSpecName: "kube-api-access-5fm4g") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "kube-api-access-5fm4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.552858 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-pod-info" (OuterVolumeSpecName: "pod-info") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.556590 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.587025 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-config-data" (OuterVolumeSpecName: "config-data") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.589825 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-server-conf" (OuterVolumeSpecName: "server-conf") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.640720 4731 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.640773 4731 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.640784 4731 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.640795 4731 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.640806 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.640816 4731 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.640826 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fm4g\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-kube-api-access-5fm4g\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.663929 4731 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.667673 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" (UID: "e48a33e6-5a6e-4eab-b4c1-df141fb4b00b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.742490 4731 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:28 crc kubenswrapper[4731]: I1203 19:16:28.742536 4731 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.069744 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e48a33e6-5a6e-4eab-b4c1-df141fb4b00b","Type":"ContainerDied","Data":"1f3b041aeeab1b3901a03578b28b9d16a47457ab118073db9a879c43c20cc1a5"} Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.069870 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.070153 4731 scope.go:117] "RemoveContainer" containerID="6c86f97930867d7e96e7d64a75e8e0731c393d4dc8cbab33295a96de1806f905" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.073573 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469480bc-e167-4ecc-87c4-9691057d999f","Type":"ContainerDied","Data":"98b24cea655e157dd1c4c37f068e846e36262777c12548713701f7a9b89f8f01"} Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.073414 4731 generic.go:334] "Generic (PLEG): container finished" podID="469480bc-e167-4ecc-87c4-9691057d999f" containerID="98b24cea655e157dd1c4c37f068e846e36262777c12548713701f7a9b89f8f01" exitCode=0 Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.121702 4731 scope.go:117] "RemoveContainer" containerID="fc32c944255cdac00d14416c11b5771f9a9f1e781738b3debf50039e4d9a17fd" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.139473 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.154602 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.164336 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:16:29 crc kubenswrapper[4731]: E1203 19:16:29.165066 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" containerName="rabbitmq" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.165088 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" containerName="rabbitmq" Dec 03 19:16:29 crc kubenswrapper[4731]: E1203 19:16:29.165104 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" 
containerName="setup-container" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.165112 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" containerName="setup-container" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.165335 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" containerName="rabbitmq" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.166476 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.175574 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.179544 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.179896 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.180085 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.180303 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.180463 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.180584 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2rjh6" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.206356 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.358544 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359019 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-config-data\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359130 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359134 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359251 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359323 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " 
pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359387 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359422 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2baac998-a8f6-4902-a641-5b9229c9dd2f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359463 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359494 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359569 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2llxv\" (UniqueName: \"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-kube-api-access-2llxv\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 
19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.359631 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2baac998-a8f6-4902-a641-5b9229c9dd2f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.460749 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.460862 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/469480bc-e167-4ecc-87c4-9691057d999f-pod-info\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.460910 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-server-conf\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.460945 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-config-data\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.460976 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/469480bc-e167-4ecc-87c4-9691057d999f-erlang-cookie-secret\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461018 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-plugins-conf\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461052 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-tls\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461080 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-plugins\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461124 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-confd\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461186 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-erlang-cookie\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc 
kubenswrapper[4731]: I1203 19:16:29.461240 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdzj2\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-kube-api-access-pdzj2\") pod \"469480bc-e167-4ecc-87c4-9691057d999f\" (UID: \"469480bc-e167-4ecc-87c4-9691057d999f\") " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461564 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461598 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461629 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461649 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2baac998-a8f6-4902-a641-5b9229c9dd2f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461671 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461688 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461725 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2llxv\" (UniqueName: \"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-kube-api-access-2llxv\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461751 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2baac998-a8f6-4902-a641-5b9229c9dd2f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461785 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461811 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.461867 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.462745 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.470075 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.474220 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.477679 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.478031 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.484825 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2baac998-a8f6-4902-a641-5b9229c9dd2f-config-data\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.487148 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.498629 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). 
InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.502363 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.534329 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2llxv\" (UniqueName: \"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-kube-api-access-2llxv\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.538107 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.538352 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/469480bc-e167-4ecc-87c4-9691057d999f-pod-info" (OuterVolumeSpecName: "pod-info") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.545026 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2baac998-a8f6-4902-a641-5b9229c9dd2f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.547144 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.550963 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2baac998-a8f6-4902-a641-5b9229c9dd2f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.551496 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-kube-api-access-pdzj2" (OuterVolumeSpecName: "kube-api-access-pdzj2") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "kube-api-access-pdzj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.552359 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/469480bc-e167-4ecc-87c4-9691057d999f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.565689 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567264 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2baac998-a8f6-4902-a641-5b9229c9dd2f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567531 4731 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567544 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdzj2\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-kube-api-access-pdzj2\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567569 4731 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567579 4731 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/469480bc-e167-4ecc-87c4-9691057d999f-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567589 4731 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/469480bc-e167-4ecc-87c4-9691057d999f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567599 4731 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567608 4731 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.567619 4731 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.605885 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-config-data" (OuterVolumeSpecName: "config-data") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.665784 4731 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.668034 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2baac998-a8f6-4902-a641-5b9229c9dd2f\") " pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.677010 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.677065 4731 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.737948 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-server-conf" (OuterVolumeSpecName: "server-conf") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.746297 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "469480bc-e167-4ecc-87c4-9691057d999f" (UID: "469480bc-e167-4ecc-87c4-9691057d999f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.778622 4731 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/469480bc-e167-4ecc-87c4-9691057d999f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.778724 4731 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/469480bc-e167-4ecc-87c4-9691057d999f-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.823788 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 19:16:29 crc kubenswrapper[4731]: I1203 19:16:29.869002 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48a33e6-5a6e-4eab-b4c1-df141fb4b00b" path="/var/lib/kubelet/pods/e48a33e6-5a6e-4eab-b4c1-df141fb4b00b/volumes" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.091552 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469480bc-e167-4ecc-87c4-9691057d999f","Type":"ContainerDied","Data":"24178e0a31d94ef5ea9772de56e020cdbc29116d7a7c895ba997a0a7209b4476"} Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.092011 4731 scope.go:117] "RemoveContainer" containerID="98b24cea655e157dd1c4c37f068e846e36262777c12548713701f7a9b89f8f01" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.092174 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.128186 4731 scope.go:117] "RemoveContainer" containerID="58682561d8a5f5c7c84ae4f7bf28b58db8754a15808e79e7e8013b7ec94685b2" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.130115 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.152332 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.174326 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:16:30 crc kubenswrapper[4731]: E1203 19:16:30.174866 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469480bc-e167-4ecc-87c4-9691057d999f" containerName="setup-container" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.174886 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="469480bc-e167-4ecc-87c4-9691057d999f" containerName="setup-container" Dec 03 19:16:30 crc kubenswrapper[4731]: E1203 19:16:30.174933 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469480bc-e167-4ecc-87c4-9691057d999f" containerName="rabbitmq" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.174941 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="469480bc-e167-4ecc-87c4-9691057d999f" containerName="rabbitmq" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.175169 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="469480bc-e167-4ecc-87c4-9691057d999f" containerName="rabbitmq" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.176710 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.180428 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.180875 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.181010 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.181132 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.181357 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5qvkc" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.181852 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.186581 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.198342 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.287886 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5c78742-a693-4329-956e-96662dfcb374-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.287955 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5c78742-a693-4329-956e-96662dfcb374-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288020 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288045 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288072 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288091 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288133 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v6zcs\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-kube-api-access-v6zcs\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288163 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288200 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288222 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.288250 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.357243 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 19:16:30 crc 
kubenswrapper[4731]: W1203 19:16:30.368552 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2baac998_a8f6_4902_a641_5b9229c9dd2f.slice/crio-95ac8d0f10eecb3e8d56b9f09fcf080b056131a9495500c2dc30f087ecbe5b6d WatchSource:0}: Error finding container 95ac8d0f10eecb3e8d56b9f09fcf080b056131a9495500c2dc30f087ecbe5b6d: Status 404 returned error can't find the container with id 95ac8d0f10eecb3e8d56b9f09fcf080b056131a9495500c2dc30f087ecbe5b6d Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.389697 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.390116 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.390145 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.390147 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") device mount path 
\"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.390859 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.390165 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.390912 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.391056 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6zcs\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-kube-api-access-v6zcs\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.391091 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc 
kubenswrapper[4731]: I1203 19:16:30.391141 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.391212 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.391245 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.391895 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.391978 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5c78742-a693-4329-956e-96662dfcb374-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.392029 4731 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5c78742-a693-4329-956e-96662dfcb374-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.392762 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.395575 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5c78742-a693-4329-956e-96662dfcb374-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.396703 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.397120 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5c78742-a693-4329-956e-96662dfcb374-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.399412 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.405547 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5c78742-a693-4329-956e-96662dfcb374-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.408129 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6zcs\" (UniqueName: \"kubernetes.io/projected/f5c78742-a693-4329-956e-96662dfcb374-kube-api-access-v6zcs\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.430061 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5c78742-a693-4329-956e-96662dfcb374\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.500166 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.823228 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6785449bd9-glm7v"] Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.826241 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.829552 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.836289 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6785449bd9-glm7v"] Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.907375 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-sb\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.907441 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-config\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.907485 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-nb\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.907509 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-openstack-edpm-ipam\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " 
pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.907554 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r48kd\" (UniqueName: \"kubernetes.io/projected/d9280bd1-c182-4bdd-93f0-f0d692dcd844-kube-api-access-r48kd\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.907652 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-dns-swift-storage-0\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:30 crc kubenswrapper[4731]: I1203 19:16:30.956976 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 19:16:30 crc kubenswrapper[4731]: W1203 19:16:30.958426 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c78742_a693_4329_956e_96662dfcb374.slice/crio-d1f59b5fd0b5d9578f8137faa89916a5205fc9a97e90f63fa3232acd93598f59 WatchSource:0}: Error finding container d1f59b5fd0b5d9578f8137faa89916a5205fc9a97e90f63fa3232acd93598f59: Status 404 returned error can't find the container with id d1f59b5fd0b5d9578f8137faa89916a5205fc9a97e90f63fa3232acd93598f59 Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.009998 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-dns-swift-storage-0\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 
crc kubenswrapper[4731]: I1203 19:16:31.010097 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-sb\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.010125 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-config\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.010158 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-nb\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.010185 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-openstack-edpm-ipam\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.010227 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r48kd\" (UniqueName: \"kubernetes.io/projected/d9280bd1-c182-4bdd-93f0-f0d692dcd844-kube-api-access-r48kd\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.011208 
4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-config\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.011240 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-openstack-edpm-ipam\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.011207 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-sb\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.011520 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-nb\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.011842 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-dns-swift-storage-0\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.035266 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r48kd\" (UniqueName: \"kubernetes.io/projected/d9280bd1-c182-4bdd-93f0-f0d692dcd844-kube-api-access-r48kd\") pod \"dnsmasq-dns-6785449bd9-glm7v\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.102375 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2baac998-a8f6-4902-a641-5b9229c9dd2f","Type":"ContainerStarted","Data":"95ac8d0f10eecb3e8d56b9f09fcf080b056131a9495500c2dc30f087ecbe5b6d"} Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.104831 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5c78742-a693-4329-956e-96662dfcb374","Type":"ContainerStarted","Data":"d1f59b5fd0b5d9578f8137faa89916a5205fc9a97e90f63fa3232acd93598f59"} Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.153662 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.693166 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6785449bd9-glm7v"] Dec 03 19:16:31 crc kubenswrapper[4731]: W1203 19:16:31.699090 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9280bd1_c182_4bdd_93f0_f0d692dcd844.slice/crio-ca1d2a93a1cf7a3490c5337569a4a669bcd1d20b6cebfaef9aa514365324fb51 WatchSource:0}: Error finding container ca1d2a93a1cf7a3490c5337569a4a669bcd1d20b6cebfaef9aa514365324fb51: Status 404 returned error can't find the container with id ca1d2a93a1cf7a3490c5337569a4a669bcd1d20b6cebfaef9aa514365324fb51 Dec 03 19:16:31 crc kubenswrapper[4731]: I1203 19:16:31.869762 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469480bc-e167-4ecc-87c4-9691057d999f" 
path="/var/lib/kubelet/pods/469480bc-e167-4ecc-87c4-9691057d999f/volumes" Dec 03 19:16:32 crc kubenswrapper[4731]: I1203 19:16:32.124621 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" event={"ID":"d9280bd1-c182-4bdd-93f0-f0d692dcd844","Type":"ContainerStarted","Data":"eb069211a86ab513da2cc5ca835f1e029f8629549a9fe31ddb636669bae42660"} Dec 03 19:16:32 crc kubenswrapper[4731]: I1203 19:16:32.124681 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" event={"ID":"d9280bd1-c182-4bdd-93f0-f0d692dcd844","Type":"ContainerStarted","Data":"ca1d2a93a1cf7a3490c5337569a4a669bcd1d20b6cebfaef9aa514365324fb51"} Dec 03 19:16:33 crc kubenswrapper[4731]: I1203 19:16:33.138998 4731 generic.go:334] "Generic (PLEG): container finished" podID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" containerID="eb069211a86ab513da2cc5ca835f1e029f8629549a9fe31ddb636669bae42660" exitCode=0 Dec 03 19:16:33 crc kubenswrapper[4731]: I1203 19:16:33.139301 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" event={"ID":"d9280bd1-c182-4bdd-93f0-f0d692dcd844","Type":"ContainerDied","Data":"eb069211a86ab513da2cc5ca835f1e029f8629549a9fe31ddb636669bae42660"} Dec 03 19:16:33 crc kubenswrapper[4731]: I1203 19:16:33.142603 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2baac998-a8f6-4902-a641-5b9229c9dd2f","Type":"ContainerStarted","Data":"54adb8ab52ad66033986463a53c477e72f03cff96dc16d9f94f15058df67a6ec"} Dec 03 19:16:33 crc kubenswrapper[4731]: I1203 19:16:33.145871 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5c78742-a693-4329-956e-96662dfcb374","Type":"ContainerStarted","Data":"03884ee2ca1ea0f9f567287816c23fe9ae0978ac5d154cd31c724f56fda4b3d8"} Dec 03 19:16:34 crc kubenswrapper[4731]: I1203 19:16:34.159605 4731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" event={"ID":"d9280bd1-c182-4bdd-93f0-f0d692dcd844","Type":"ContainerStarted","Data":"17229dd0cfeab1204ed5d5bb35191e4bac3b9e5653c070ea63b3ac76a8a602c8"} Dec 03 19:16:34 crc kubenswrapper[4731]: I1203 19:16:34.160326 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:34 crc kubenswrapper[4731]: I1203 19:16:34.200610 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" podStartSLOduration=4.200575663 podStartE2EDuration="4.200575663s" podCreationTimestamp="2025-12-03 19:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:16:34.186993696 +0000 UTC m=+1314.785588160" watchObservedRunningTime="2025-12-03 19:16:34.200575663 +0000 UTC m=+1314.799170157" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.155603 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.301456 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759c88d79f-424sd"] Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.301881 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-759c88d79f-424sd" podUID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" containerName="dnsmasq-dns" containerID="cri-o://7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a" gracePeriod=10 Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.443228 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f469589c7-hz6wc"] Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.445461 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.455262 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f469589c7-hz6wc"] Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.564908 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-config\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.565907 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-ovsdbserver-sb\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.566091 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-openstack-edpm-ipam\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.566923 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqfx\" (UniqueName: \"kubernetes.io/projected/2995333d-35da-4bd5-a503-e998d4311219-kube-api-access-6kqfx\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.567044 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-ovsdbserver-nb\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.567164 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-dns-swift-storage-0\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.669756 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqfx\" (UniqueName: \"kubernetes.io/projected/2995333d-35da-4bd5-a503-e998d4311219-kube-api-access-6kqfx\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.669815 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-ovsdbserver-nb\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.669854 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-dns-swift-storage-0\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.669948 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-config\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.669973 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-ovsdbserver-sb\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.670003 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-openstack-edpm-ipam\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.671495 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-dns-swift-storage-0\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.672195 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-ovsdbserver-nb\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.672366 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-openstack-edpm-ipam\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.672456 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-config\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.672632 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2995333d-35da-4bd5-a503-e998d4311219-ovsdbserver-sb\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.699339 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kqfx\" (UniqueName: \"kubernetes.io/projected/2995333d-35da-4bd5-a503-e998d4311219-kube-api-access-6kqfx\") pod \"dnsmasq-dns-f469589c7-hz6wc\" (UID: \"2995333d-35da-4bd5-a503-e998d4311219\") " pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.793693 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.795683 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.975366 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-nb\") pod \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.975804 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc28s\" (UniqueName: \"kubernetes.io/projected/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-kube-api-access-rc28s\") pod \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.976052 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-config\") pod \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.976108 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-dns-swift-storage-0\") pod \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.976208 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-sb\") pod \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\" (UID: \"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa\") " Dec 03 19:16:41 crc kubenswrapper[4731]: I1203 19:16:41.983535 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-kube-api-access-rc28s" (OuterVolumeSpecName: "kube-api-access-rc28s") pod "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" (UID: "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa"). InnerVolumeSpecName "kube-api-access-rc28s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.026849 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" (UID: "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.032270 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" (UID: "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.035197 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" (UID: "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.047194 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-config" (OuterVolumeSpecName: "config") pod "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" (UID: "25d0a493-a0ec-4cdf-a367-3a60aab1ccfa"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.078766 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.078810 4731 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.078823 4731 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.078832 4731 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.078844 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc28s\" (UniqueName: \"kubernetes.io/projected/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa-kube-api-access-rc28s\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.270659 4731 generic.go:334] "Generic (PLEG): container finished" podID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" containerID="7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a" exitCode=0 Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.270724 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759c88d79f-424sd" 
event={"ID":"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa","Type":"ContainerDied","Data":"7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a"} Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.270767 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759c88d79f-424sd" event={"ID":"25d0a493-a0ec-4cdf-a367-3a60aab1ccfa","Type":"ContainerDied","Data":"9b5a8c680d72e088e67020368b830e55941ed567de223b2dd2e114b35f813457"} Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.270792 4731 scope.go:117] "RemoveContainer" containerID="7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.270996 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759c88d79f-424sd" Dec 03 19:16:42 crc kubenswrapper[4731]: W1203 19:16:42.304583 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2995333d_35da_4bd5_a503_e998d4311219.slice/crio-81b0da6736471f5c6f4ed052ea9a680eeafe71406ac7e7973119615459fb05e0 WatchSource:0}: Error finding container 81b0da6736471f5c6f4ed052ea9a680eeafe71406ac7e7973119615459fb05e0: Status 404 returned error can't find the container with id 81b0da6736471f5c6f4ed052ea9a680eeafe71406ac7e7973119615459fb05e0 Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.309847 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f469589c7-hz6wc"] Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.324631 4731 scope.go:117] "RemoveContainer" containerID="4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.338290 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759c88d79f-424sd"] Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.349435 4731 scope.go:117] "RemoveContainer" 
containerID="7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a" Dec 03 19:16:42 crc kubenswrapper[4731]: E1203 19:16:42.350022 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a\": container with ID starting with 7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a not found: ID does not exist" containerID="7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.350086 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a"} err="failed to get container status \"7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a\": rpc error: code = NotFound desc = could not find container \"7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a\": container with ID starting with 7fb0716defb1f8b370443a4bd58d7649b63cfb13724af90a2c1a79c84c5b032a not found: ID does not exist" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.350129 4731 scope.go:117] "RemoveContainer" containerID="4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b" Dec 03 19:16:42 crc kubenswrapper[4731]: E1203 19:16:42.350547 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b\": container with ID starting with 4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b not found: ID does not exist" containerID="4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.350598 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b"} err="failed to get container status \"4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b\": rpc error: code = NotFound desc = could not find container \"4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b\": container with ID starting with 4f5f0e790abc22cffc2299aef2fb4905d966a6f1a13f5727a4617021361bbe3b not found: ID does not exist" Dec 03 19:16:42 crc kubenswrapper[4731]: I1203 19:16:42.352921 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759c88d79f-424sd"] Dec 03 19:16:43 crc kubenswrapper[4731]: I1203 19:16:43.284756 4731 generic.go:334] "Generic (PLEG): container finished" podID="2995333d-35da-4bd5-a503-e998d4311219" containerID="c5ce174eb02b79538ad32bbce36128ee451380919f2cb6ad79c6e47798eb9f69" exitCode=0 Dec 03 19:16:43 crc kubenswrapper[4731]: I1203 19:16:43.284820 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f469589c7-hz6wc" event={"ID":"2995333d-35da-4bd5-a503-e998d4311219","Type":"ContainerDied","Data":"c5ce174eb02b79538ad32bbce36128ee451380919f2cb6ad79c6e47798eb9f69"} Dec 03 19:16:43 crc kubenswrapper[4731]: I1203 19:16:43.284875 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f469589c7-hz6wc" event={"ID":"2995333d-35da-4bd5-a503-e998d4311219","Type":"ContainerStarted","Data":"81b0da6736471f5c6f4ed052ea9a680eeafe71406ac7e7973119615459fb05e0"} Dec 03 19:16:43 crc kubenswrapper[4731]: I1203 19:16:43.929167 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" path="/var/lib/kubelet/pods/25d0a493-a0ec-4cdf-a367-3a60aab1ccfa/volumes" Dec 03 19:16:44 crc kubenswrapper[4731]: I1203 19:16:44.299283 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f469589c7-hz6wc" 
event={"ID":"2995333d-35da-4bd5-a503-e998d4311219","Type":"ContainerStarted","Data":"dd095ca49795323c1a657deec79f58a0fe9182937ccc0471afc67146cd7127f1"} Dec 03 19:16:44 crc kubenswrapper[4731]: I1203 19:16:44.300822 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:44 crc kubenswrapper[4731]: I1203 19:16:44.334451 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f469589c7-hz6wc" podStartSLOduration=3.334418268 podStartE2EDuration="3.334418268s" podCreationTimestamp="2025-12-03 19:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:16:44.319874502 +0000 UTC m=+1324.918468976" watchObservedRunningTime="2025-12-03 19:16:44.334418268 +0000 UTC m=+1324.933012752" Dec 03 19:16:51 crc kubenswrapper[4731]: I1203 19:16:51.795591 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f469589c7-hz6wc" Dec 03 19:16:51 crc kubenswrapper[4731]: I1203 19:16:51.883642 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6785449bd9-glm7v"] Dec 03 19:16:51 crc kubenswrapper[4731]: I1203 19:16:51.883981 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" podUID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" containerName="dnsmasq-dns" containerID="cri-o://17229dd0cfeab1204ed5d5bb35191e4bac3b9e5653c070ea63b3ac76a8a602c8" gracePeriod=10 Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.404667 4731 generic.go:334] "Generic (PLEG): container finished" podID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" containerID="17229dd0cfeab1204ed5d5bb35191e4bac3b9e5653c070ea63b3ac76a8a602c8" exitCode=0 Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.405209 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6785449bd9-glm7v" event={"ID":"d9280bd1-c182-4bdd-93f0-f0d692dcd844","Type":"ContainerDied","Data":"17229dd0cfeab1204ed5d5bb35191e4bac3b9e5653c070ea63b3ac76a8a602c8"} Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.477837 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.646413 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-config\") pod \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.646518 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-dns-swift-storage-0\") pod \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.646595 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-openstack-edpm-ipam\") pod \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.646756 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-nb\") pod \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.647509 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r48kd\" (UniqueName: 
\"kubernetes.io/projected/d9280bd1-c182-4bdd-93f0-f0d692dcd844-kube-api-access-r48kd\") pod \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.647578 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-sb\") pod \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\" (UID: \"d9280bd1-c182-4bdd-93f0-f0d692dcd844\") " Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.653434 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9280bd1-c182-4bdd-93f0-f0d692dcd844-kube-api-access-r48kd" (OuterVolumeSpecName: "kube-api-access-r48kd") pod "d9280bd1-c182-4bdd-93f0-f0d692dcd844" (UID: "d9280bd1-c182-4bdd-93f0-f0d692dcd844"). InnerVolumeSpecName "kube-api-access-r48kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.698372 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9280bd1-c182-4bdd-93f0-f0d692dcd844" (UID: "d9280bd1-c182-4bdd-93f0-f0d692dcd844"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.713385 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d9280bd1-c182-4bdd-93f0-f0d692dcd844" (UID: "d9280bd1-c182-4bdd-93f0-f0d692dcd844"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.716404 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d9280bd1-c182-4bdd-93f0-f0d692dcd844" (UID: "d9280bd1-c182-4bdd-93f0-f0d692dcd844"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.721874 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-config" (OuterVolumeSpecName: "config") pod "d9280bd1-c182-4bdd-93f0-f0d692dcd844" (UID: "d9280bd1-c182-4bdd-93f0-f0d692dcd844"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.723945 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d9280bd1-c182-4bdd-93f0-f0d692dcd844" (UID: "d9280bd1-c182-4bdd-93f0-f0d692dcd844"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.750200 4731 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.750232 4731 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.750241 4731 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.750250 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r48kd\" (UniqueName: \"kubernetes.io/projected/d9280bd1-c182-4bdd-93f0-f0d692dcd844-kube-api-access-r48kd\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.750275 4731 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:52 crc kubenswrapper[4731]: I1203 19:16:52.750284 4731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9280bd1-c182-4bdd-93f0-f0d692dcd844-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:16:53 crc kubenswrapper[4731]: I1203 19:16:53.417429 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" event={"ID":"d9280bd1-c182-4bdd-93f0-f0d692dcd844","Type":"ContainerDied","Data":"ca1d2a93a1cf7a3490c5337569a4a669bcd1d20b6cebfaef9aa514365324fb51"} Dec 
03 19:16:53 crc kubenswrapper[4731]: I1203 19:16:53.417543 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6785449bd9-glm7v" Dec 03 19:16:53 crc kubenswrapper[4731]: I1203 19:16:53.417819 4731 scope.go:117] "RemoveContainer" containerID="17229dd0cfeab1204ed5d5bb35191e4bac3b9e5653c070ea63b3ac76a8a602c8" Dec 03 19:16:53 crc kubenswrapper[4731]: I1203 19:16:53.453033 4731 scope.go:117] "RemoveContainer" containerID="eb069211a86ab513da2cc5ca835f1e029f8629549a9fe31ddb636669bae42660" Dec 03 19:16:53 crc kubenswrapper[4731]: I1203 19:16:53.459351 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6785449bd9-glm7v"] Dec 03 19:16:53 crc kubenswrapper[4731]: I1203 19:16:53.467310 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6785449bd9-glm7v"] Dec 03 19:16:53 crc kubenswrapper[4731]: I1203 19:16:53.869756 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" path="/var/lib/kubelet/pods/d9280bd1-c182-4bdd-93f0-f0d692dcd844/volumes" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.286425 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb"] Dec 03 19:17:05 crc kubenswrapper[4731]: E1203 19:17:05.287395 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" containerName="dnsmasq-dns" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.287415 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" containerName="dnsmasq-dns" Dec 03 19:17:05 crc kubenswrapper[4731]: E1203 19:17:05.287435 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" containerName="dnsmasq-dns" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.287443 4731 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" containerName="dnsmasq-dns" Dec 03 19:17:05 crc kubenswrapper[4731]: E1203 19:17:05.287456 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" containerName="init" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.287463 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" containerName="init" Dec 03 19:17:05 crc kubenswrapper[4731]: E1203 19:17:05.287496 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" containerName="init" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.287524 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" containerName="init" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.287773 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9280bd1-c182-4bdd-93f0-f0d692dcd844" containerName="dnsmasq-dns" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.287802 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d0a493-a0ec-4cdf-a367-3a60aab1ccfa" containerName="dnsmasq-dns" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.288717 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.291111 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.291388 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.291497 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.292102 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.317125 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2kjl\" (UniqueName: \"kubernetes.io/projected/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-kube-api-access-l2kjl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.317452 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.317845 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.317954 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.320190 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb"] Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.419201 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.419312 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.419373 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2kjl\" (UniqueName: 
\"kubernetes.io/projected/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-kube-api-access-l2kjl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.419436 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.425890 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.426638 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.430188 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc 
kubenswrapper[4731]: I1203 19:17:05.437835 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kjl\" (UniqueName: \"kubernetes.io/projected/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-kube-api-access-l2kjl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.543437 4731 generic.go:334] "Generic (PLEG): container finished" podID="2baac998-a8f6-4902-a641-5b9229c9dd2f" containerID="54adb8ab52ad66033986463a53c477e72f03cff96dc16d9f94f15058df67a6ec" exitCode=0 Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.543488 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2baac998-a8f6-4902-a641-5b9229c9dd2f","Type":"ContainerDied","Data":"54adb8ab52ad66033986463a53c477e72f03cff96dc16d9f94f15058df67a6ec"} Dec 03 19:17:05 crc kubenswrapper[4731]: I1203 19:17:05.613712 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:06 crc kubenswrapper[4731]: I1203 19:17:06.165247 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb"] Dec 03 19:17:06 crc kubenswrapper[4731]: W1203 19:17:06.170410 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3384f02c_5b8e_4711_9595_0c62bb7fe7d4.slice/crio-d3b5b7465e5e9e4f0994d66963fa98cf3e2cb614a43c4651f302d98f8de4dc7f WatchSource:0}: Error finding container d3b5b7465e5e9e4f0994d66963fa98cf3e2cb614a43c4651f302d98f8de4dc7f: Status 404 returned error can't find the container with id d3b5b7465e5e9e4f0994d66963fa98cf3e2cb614a43c4651f302d98f8de4dc7f Dec 03 19:17:06 crc kubenswrapper[4731]: I1203 19:17:06.553487 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" event={"ID":"3384f02c-5b8e-4711-9595-0c62bb7fe7d4","Type":"ContainerStarted","Data":"d3b5b7465e5e9e4f0994d66963fa98cf3e2cb614a43c4651f302d98f8de4dc7f"} Dec 03 19:17:06 crc kubenswrapper[4731]: I1203 19:17:06.555642 4731 generic.go:334] "Generic (PLEG): container finished" podID="f5c78742-a693-4329-956e-96662dfcb374" containerID="03884ee2ca1ea0f9f567287816c23fe9ae0978ac5d154cd31c724f56fda4b3d8" exitCode=0 Dec 03 19:17:06 crc kubenswrapper[4731]: I1203 19:17:06.555699 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5c78742-a693-4329-956e-96662dfcb374","Type":"ContainerDied","Data":"03884ee2ca1ea0f9f567287816c23fe9ae0978ac5d154cd31c724f56fda4b3d8"} Dec 03 19:17:06 crc kubenswrapper[4731]: I1203 19:17:06.557457 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"2baac998-a8f6-4902-a641-5b9229c9dd2f","Type":"ContainerStarted","Data":"5932f7b273e6c13e515f9fe6de7abe1443b5674faa47bea28498505563944c4e"} Dec 03 19:17:06 crc kubenswrapper[4731]: I1203 19:17:06.557706 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 19:17:06 crc kubenswrapper[4731]: I1203 19:17:06.614606 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.614576508 podStartE2EDuration="37.614576508s" podCreationTimestamp="2025-12-03 19:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:17:06.606727092 +0000 UTC m=+1347.205321576" watchObservedRunningTime="2025-12-03 19:17:06.614576508 +0000 UTC m=+1347.213170982" Dec 03 19:17:07 crc kubenswrapper[4731]: I1203 19:17:07.575620 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5c78742-a693-4329-956e-96662dfcb374","Type":"ContainerStarted","Data":"cdcc0cc3e16344995599ba49c4dec72bce7db574cb3d3603674817041ac17f95"} Dec 03 19:17:07 crc kubenswrapper[4731]: I1203 19:17:07.576386 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:17:07 crc kubenswrapper[4731]: I1203 19:17:07.599277 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.599240631 podStartE2EDuration="37.599240631s" podCreationTimestamp="2025-12-03 19:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 19:17:07.597707203 +0000 UTC m=+1348.196301667" watchObservedRunningTime="2025-12-03 19:17:07.599240631 +0000 UTC m=+1348.197835095" Dec 03 19:17:16 crc kubenswrapper[4731]: I1203 19:17:16.687465 4731 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" event={"ID":"3384f02c-5b8e-4711-9595-0c62bb7fe7d4","Type":"ContainerStarted","Data":"99ad58bad5bf7848662beea9e42a033cd68f06bcce17477009a1d3dae1c68609"} Dec 03 19:17:16 crc kubenswrapper[4731]: I1203 19:17:16.722822 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" podStartSLOduration=2.379020378 podStartE2EDuration="11.722791958s" podCreationTimestamp="2025-12-03 19:17:05 +0000 UTC" firstStartedPulling="2025-12-03 19:17:06.172423499 +0000 UTC m=+1346.771017963" lastFinishedPulling="2025-12-03 19:17:15.516195079 +0000 UTC m=+1356.114789543" observedRunningTime="2025-12-03 19:17:16.709779309 +0000 UTC m=+1357.308373783" watchObservedRunningTime="2025-12-03 19:17:16.722791958 +0000 UTC m=+1357.321386442" Dec 03 19:17:19 crc kubenswrapper[4731]: I1203 19:17:19.830622 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 19:17:20 crc kubenswrapper[4731]: I1203 19:17:20.503427 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 19:17:27 crc kubenswrapper[4731]: I1203 19:17:27.807696 4731 generic.go:334] "Generic (PLEG): container finished" podID="3384f02c-5b8e-4711-9595-0c62bb7fe7d4" containerID="99ad58bad5bf7848662beea9e42a033cd68f06bcce17477009a1d3dae1c68609" exitCode=0 Dec 03 19:17:27 crc kubenswrapper[4731]: I1203 19:17:27.807793 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" event={"ID":"3384f02c-5b8e-4711-9595-0c62bb7fe7d4","Type":"ContainerDied","Data":"99ad58bad5bf7848662beea9e42a033cd68f06bcce17477009a1d3dae1c68609"} Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.314546 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.465175 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2kjl\" (UniqueName: \"kubernetes.io/projected/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-kube-api-access-l2kjl\") pod \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.465270 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-ssh-key\") pod \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.465347 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-inventory\") pod \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.465503 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-repo-setup-combined-ca-bundle\") pod \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\" (UID: \"3384f02c-5b8e-4711-9595-0c62bb7fe7d4\") " Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.473592 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-kube-api-access-l2kjl" (OuterVolumeSpecName: "kube-api-access-l2kjl") pod "3384f02c-5b8e-4711-9595-0c62bb7fe7d4" (UID: "3384f02c-5b8e-4711-9595-0c62bb7fe7d4"). InnerVolumeSpecName "kube-api-access-l2kjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.473682 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3384f02c-5b8e-4711-9595-0c62bb7fe7d4" (UID: "3384f02c-5b8e-4711-9595-0c62bb7fe7d4"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.500351 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3384f02c-5b8e-4711-9595-0c62bb7fe7d4" (UID: "3384f02c-5b8e-4711-9595-0c62bb7fe7d4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.514858 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-inventory" (OuterVolumeSpecName: "inventory") pod "3384f02c-5b8e-4711-9595-0c62bb7fe7d4" (UID: "3384f02c-5b8e-4711-9595-0c62bb7fe7d4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.568712 4731 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.568958 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2kjl\" (UniqueName: \"kubernetes.io/projected/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-kube-api-access-l2kjl\") on node \"crc\" DevicePath \"\"" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.569069 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.569182 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3384f02c-5b8e-4711-9595-0c62bb7fe7d4-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.832571 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.832572 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb" event={"ID":"3384f02c-5b8e-4711-9595-0c62bb7fe7d4","Type":"ContainerDied","Data":"d3b5b7465e5e9e4f0994d66963fa98cf3e2cb614a43c4651f302d98f8de4dc7f"} Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.832654 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3b5b7465e5e9e4f0994d66963fa98cf3e2cb614a43c4651f302d98f8de4dc7f" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.923595 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj"] Dec 03 19:17:29 crc kubenswrapper[4731]: E1203 19:17:29.925200 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3384f02c-5b8e-4711-9595-0c62bb7fe7d4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.925230 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3384f02c-5b8e-4711-9595-0c62bb7fe7d4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.926269 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="3384f02c-5b8e-4711-9595-0c62bb7fe7d4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.927713 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.929846 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.930736 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.930962 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.931107 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:17:29 crc kubenswrapper[4731]: I1203 19:17:29.936811 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj"] Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.079206 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wl8z\" (UniqueName: \"kubernetes.io/projected/d910b616-8396-482a-8fd9-976f3b1ac4a0-kube-api-access-2wl8z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.079393 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.079425 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.180834 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.180885 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.180964 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl8z\" (UniqueName: \"kubernetes.io/projected/d910b616-8396-482a-8fd9-976f3b1ac4a0-kube-api-access-2wl8z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.190077 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.192121 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.201158 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wl8z\" (UniqueName: \"kubernetes.io/projected/d910b616-8396-482a-8fd9-976f3b1ac4a0-kube-api-access-2wl8z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ptwmj\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.254789 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.814800 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj"] Dec 03 19:17:30 crc kubenswrapper[4731]: W1203 19:17:30.816725 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd910b616_8396_482a_8fd9_976f3b1ac4a0.slice/crio-692790739ac30450ce005e821c4a0ce9ced2b3e278b2ccdbe9ba0c50897f47d6 WatchSource:0}: Error finding container 692790739ac30450ce005e821c4a0ce9ced2b3e278b2ccdbe9ba0c50897f47d6: Status 404 returned error can't find the container with id 692790739ac30450ce005e821c4a0ce9ced2b3e278b2ccdbe9ba0c50897f47d6 Dec 03 19:17:30 crc kubenswrapper[4731]: I1203 19:17:30.842948 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" event={"ID":"d910b616-8396-482a-8fd9-976f3b1ac4a0","Type":"ContainerStarted","Data":"692790739ac30450ce005e821c4a0ce9ced2b3e278b2ccdbe9ba0c50897f47d6"} Dec 03 19:17:31 crc kubenswrapper[4731]: I1203 19:17:31.876583 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" event={"ID":"d910b616-8396-482a-8fd9-976f3b1ac4a0","Type":"ContainerStarted","Data":"d2dc2777bfa6f3e1f1ed33de5a1dd467aebce75fce547610be2e05dc2f141e49"} Dec 03 19:17:31 crc kubenswrapper[4731]: I1203 19:17:31.890387 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" podStartSLOduration=2.4668442 podStartE2EDuration="2.890359197s" podCreationTimestamp="2025-12-03 19:17:29 +0000 UTC" firstStartedPulling="2025-12-03 19:17:30.819630626 +0000 UTC m=+1371.418225080" lastFinishedPulling="2025-12-03 19:17:31.243145613 +0000 UTC m=+1371.841740077" observedRunningTime="2025-12-03 
19:17:31.877548996 +0000 UTC m=+1372.476143500" watchObservedRunningTime="2025-12-03 19:17:31.890359197 +0000 UTC m=+1372.488953671" Dec 03 19:17:34 crc kubenswrapper[4731]: I1203 19:17:34.888238 4731 generic.go:334] "Generic (PLEG): container finished" podID="d910b616-8396-482a-8fd9-976f3b1ac4a0" containerID="d2dc2777bfa6f3e1f1ed33de5a1dd467aebce75fce547610be2e05dc2f141e49" exitCode=0 Dec 03 19:17:34 crc kubenswrapper[4731]: I1203 19:17:34.888396 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" event={"ID":"d910b616-8396-482a-8fd9-976f3b1ac4a0","Type":"ContainerDied","Data":"d2dc2777bfa6f3e1f1ed33de5a1dd467aebce75fce547610be2e05dc2f141e49"} Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.298649 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.418349 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-ssh-key\") pod \"d910b616-8396-482a-8fd9-976f3b1ac4a0\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.418863 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-inventory\") pod \"d910b616-8396-482a-8fd9-976f3b1ac4a0\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.419096 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wl8z\" (UniqueName: \"kubernetes.io/projected/d910b616-8396-482a-8fd9-976f3b1ac4a0-kube-api-access-2wl8z\") pod \"d910b616-8396-482a-8fd9-976f3b1ac4a0\" (UID: \"d910b616-8396-482a-8fd9-976f3b1ac4a0\") " Dec 03 19:17:36 crc 
kubenswrapper[4731]: I1203 19:17:36.432644 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d910b616-8396-482a-8fd9-976f3b1ac4a0-kube-api-access-2wl8z" (OuterVolumeSpecName: "kube-api-access-2wl8z") pod "d910b616-8396-482a-8fd9-976f3b1ac4a0" (UID: "d910b616-8396-482a-8fd9-976f3b1ac4a0"). InnerVolumeSpecName "kube-api-access-2wl8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.447661 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-inventory" (OuterVolumeSpecName: "inventory") pod "d910b616-8396-482a-8fd9-976f3b1ac4a0" (UID: "d910b616-8396-482a-8fd9-976f3b1ac4a0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.448341 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d910b616-8396-482a-8fd9-976f3b1ac4a0" (UID: "d910b616-8396-482a-8fd9-976f3b1ac4a0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.521415 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.521712 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wl8z\" (UniqueName: \"kubernetes.io/projected/d910b616-8396-482a-8fd9-976f3b1ac4a0-kube-api-access-2wl8z\") on node \"crc\" DevicePath \"\"" Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.521804 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d910b616-8396-482a-8fd9-976f3b1ac4a0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.908547 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" event={"ID":"d910b616-8396-482a-8fd9-976f3b1ac4a0","Type":"ContainerDied","Data":"692790739ac30450ce005e821c4a0ce9ced2b3e278b2ccdbe9ba0c50897f47d6"} Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.908602 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="692790739ac30450ce005e821c4a0ce9ced2b3e278b2ccdbe9ba0c50897f47d6" Dec 03 19:17:36 crc kubenswrapper[4731]: I1203 19:17:36.908678 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ptwmj" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.026340 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw"] Dec 03 19:17:37 crc kubenswrapper[4731]: E1203 19:17:37.026902 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d910b616-8396-482a-8fd9-976f3b1ac4a0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.026923 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d910b616-8396-482a-8fd9-976f3b1ac4a0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.027185 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d910b616-8396-482a-8fd9-976f3b1ac4a0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.027962 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.033974 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.034462 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.034709 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.035029 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.036614 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw"] Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.133653 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8cgn\" (UniqueName: \"kubernetes.io/projected/079e5870-590d-4617-b9de-acdae5e59284-kube-api-access-d8cgn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.133749 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 
19:17:37.133816 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.133930 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.236726 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.237302 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.237410 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.237563 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8cgn\" (UniqueName: \"kubernetes.io/projected/079e5870-590d-4617-b9de-acdae5e59284-kube-api-access-d8cgn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.243067 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.244633 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.246380 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.262629 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d8cgn\" (UniqueName: \"kubernetes.io/projected/079e5870-590d-4617-b9de-acdae5e59284-kube-api-access-d8cgn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.349772 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:17:37 crc kubenswrapper[4731]: I1203 19:17:37.939747 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw"] Dec 03 19:17:38 crc kubenswrapper[4731]: I1203 19:17:38.934956 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" event={"ID":"079e5870-590d-4617-b9de-acdae5e59284","Type":"ContainerStarted","Data":"41b7840c8620ce68978eac5ece020203627e8efb9cdc7490df150accfe1653f6"} Dec 03 19:17:38 crc kubenswrapper[4731]: I1203 19:17:38.935733 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" event={"ID":"079e5870-590d-4617-b9de-acdae5e59284","Type":"ContainerStarted","Data":"c822cea4becdfd9a4b1f287e1bc48cb26b66a9ed1d4794356e9adae712bedcea"} Dec 03 19:17:38 crc kubenswrapper[4731]: I1203 19:17:38.966454 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" podStartSLOduration=1.5539128880000002 podStartE2EDuration="1.966426498s" podCreationTimestamp="2025-12-03 19:17:37 +0000 UTC" firstStartedPulling="2025-12-03 19:17:37.933191331 +0000 UTC m=+1378.531785785" lastFinishedPulling="2025-12-03 19:17:38.345704941 +0000 UTC m=+1378.944299395" observedRunningTime="2025-12-03 19:17:38.952462563 +0000 UTC m=+1379.551057027" 
watchObservedRunningTime="2025-12-03 19:17:38.966426498 +0000 UTC m=+1379.565020962" Dec 03 19:18:17 crc kubenswrapper[4731]: I1203 19:18:17.674523 4731 scope.go:117] "RemoveContainer" containerID="01710f631071c8907e484cd17ba7623a49c9b45ef08b12efd0a650d841207ff8" Dec 03 19:18:26 crc kubenswrapper[4731]: I1203 19:18:26.469063 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:18:26 crc kubenswrapper[4731]: I1203 19:18:26.469891 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:18:56 crc kubenswrapper[4731]: I1203 19:18:56.468683 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:18:56 crc kubenswrapper[4731]: I1203 19:18:56.469303 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.446993 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gphs5"] Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.480632 4731 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gphs5"] Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.481236 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.660711 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-utilities\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.661064 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv6hb\" (UniqueName: \"kubernetes.io/projected/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-kube-api-access-fv6hb\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.661126 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-catalog-content\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.763417 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-utilities\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.763482 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv6hb\" (UniqueName: \"kubernetes.io/projected/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-kube-api-access-fv6hb\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.763545 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-catalog-content\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.764069 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-catalog-content\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.764078 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-utilities\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.785811 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv6hb\" (UniqueName: \"kubernetes.io/projected/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-kube-api-access-fv6hb\") pod \"certified-operators-gphs5\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:18:59 crc kubenswrapper[4731]: I1203 19:18:59.810821 4731 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:19:00 crc kubenswrapper[4731]: I1203 19:19:00.390903 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gphs5"] Dec 03 19:19:00 crc kubenswrapper[4731]: I1203 19:19:00.799376 4731 generic.go:334] "Generic (PLEG): container finished" podID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerID="ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e" exitCode=0 Dec 03 19:19:00 crc kubenswrapper[4731]: I1203 19:19:00.799487 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gphs5" event={"ID":"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3","Type":"ContainerDied","Data":"ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e"} Dec 03 19:19:00 crc kubenswrapper[4731]: I1203 19:19:00.799802 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gphs5" event={"ID":"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3","Type":"ContainerStarted","Data":"5459a0190a4fe4a732c76b0503c2d09f569564cf5d51b1d4f1f4e9fb2dfe3981"} Dec 03 19:19:01 crc kubenswrapper[4731]: I1203 19:19:01.818349 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gphs5" event={"ID":"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3","Type":"ContainerStarted","Data":"d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c"} Dec 03 19:19:02 crc kubenswrapper[4731]: I1203 19:19:02.830686 4731 generic.go:334] "Generic (PLEG): container finished" podID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerID="d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c" exitCode=0 Dec 03 19:19:02 crc kubenswrapper[4731]: I1203 19:19:02.830906 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gphs5" 
event={"ID":"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3","Type":"ContainerDied","Data":"d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c"} Dec 03 19:19:03 crc kubenswrapper[4731]: I1203 19:19:03.842000 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gphs5" event={"ID":"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3","Type":"ContainerStarted","Data":"c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90"} Dec 03 19:19:03 crc kubenswrapper[4731]: I1203 19:19:03.871245 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gphs5" podStartSLOduration=2.278564788 podStartE2EDuration="4.871214887s" podCreationTimestamp="2025-12-03 19:18:59 +0000 UTC" firstStartedPulling="2025-12-03 19:19:00.801575673 +0000 UTC m=+1461.400170137" lastFinishedPulling="2025-12-03 19:19:03.394225762 +0000 UTC m=+1463.992820236" observedRunningTime="2025-12-03 19:19:03.866993262 +0000 UTC m=+1464.465587726" watchObservedRunningTime="2025-12-03 19:19:03.871214887 +0000 UTC m=+1464.469809361" Dec 03 19:19:09 crc kubenswrapper[4731]: I1203 19:19:09.811790 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:19:09 crc kubenswrapper[4731]: I1203 19:19:09.812402 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:19:09 crc kubenswrapper[4731]: I1203 19:19:09.889829 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:19:09 crc kubenswrapper[4731]: I1203 19:19:09.968399 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:19:10 crc kubenswrapper[4731]: I1203 19:19:10.134888 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-gphs5"] Dec 03 19:19:11 crc kubenswrapper[4731]: I1203 19:19:11.924340 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gphs5" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerName="registry-server" containerID="cri-o://c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90" gracePeriod=2 Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.415271 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.540084 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv6hb\" (UniqueName: \"kubernetes.io/projected/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-kube-api-access-fv6hb\") pod \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.540192 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-catalog-content\") pod \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.540522 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-utilities\") pod \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\" (UID: \"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3\") " Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.541373 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-utilities" (OuterVolumeSpecName: "utilities") pod "fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" (UID: 
"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.562568 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-kube-api-access-fv6hb" (OuterVolumeSpecName: "kube-api-access-fv6hb") pod "fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" (UID: "fa4e4006-5fc6-460a-89fa-534e2e2fd8a3"). InnerVolumeSpecName "kube-api-access-fv6hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.614125 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" (UID: "fa4e4006-5fc6-460a-89fa-534e2e2fd8a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.643907 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.643958 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv6hb\" (UniqueName: \"kubernetes.io/projected/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-kube-api-access-fv6hb\") on node \"crc\" DevicePath \"\"" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.643977 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.936322 4731 generic.go:334] "Generic (PLEG): container finished" 
podID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerID="c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90" exitCode=0 Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.936374 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gphs5" event={"ID":"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3","Type":"ContainerDied","Data":"c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90"} Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.936411 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gphs5" event={"ID":"fa4e4006-5fc6-460a-89fa-534e2e2fd8a3","Type":"ContainerDied","Data":"5459a0190a4fe4a732c76b0503c2d09f569564cf5d51b1d4f1f4e9fb2dfe3981"} Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.936405 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gphs5" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.936490 4731 scope.go:117] "RemoveContainer" containerID="c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.958787 4731 scope.go:117] "RemoveContainer" containerID="d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c" Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.982070 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gphs5"] Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.995910 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gphs5"] Dec 03 19:19:12 crc kubenswrapper[4731]: I1203 19:19:12.998274 4731 scope.go:117] "RemoveContainer" containerID="ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e" Dec 03 19:19:13 crc kubenswrapper[4731]: I1203 19:19:13.050213 4731 scope.go:117] "RemoveContainer" 
containerID="c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90" Dec 03 19:19:13 crc kubenswrapper[4731]: E1203 19:19:13.051498 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90\": container with ID starting with c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90 not found: ID does not exist" containerID="c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90" Dec 03 19:19:13 crc kubenswrapper[4731]: I1203 19:19:13.051562 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90"} err="failed to get container status \"c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90\": rpc error: code = NotFound desc = could not find container \"c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90\": container with ID starting with c22ad6085f149289f1aff8cd07f12c33aea40de9d143aec7bfcb1a67a90eee90 not found: ID does not exist" Dec 03 19:19:13 crc kubenswrapper[4731]: I1203 19:19:13.051596 4731 scope.go:117] "RemoveContainer" containerID="d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c" Dec 03 19:19:13 crc kubenswrapper[4731]: E1203 19:19:13.052147 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c\": container with ID starting with d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c not found: ID does not exist" containerID="d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c" Dec 03 19:19:13 crc kubenswrapper[4731]: I1203 19:19:13.052175 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c"} err="failed to get container status \"d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c\": rpc error: code = NotFound desc = could not find container \"d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c\": container with ID starting with d06baab55afcf1d31d5701becc3e33643be2ed4b53b38ba80a6b6e909d8d7b9c not found: ID does not exist" Dec 03 19:19:13 crc kubenswrapper[4731]: I1203 19:19:13.052189 4731 scope.go:117] "RemoveContainer" containerID="ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e" Dec 03 19:19:13 crc kubenswrapper[4731]: E1203 19:19:13.052611 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e\": container with ID starting with ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e not found: ID does not exist" containerID="ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e" Dec 03 19:19:13 crc kubenswrapper[4731]: I1203 19:19:13.052679 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e"} err="failed to get container status \"ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e\": rpc error: code = NotFound desc = could not find container \"ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e\": container with ID starting with ecb8d6d679e799d86b454375e5d486b10494956a2b5f842bac0a6bb568ebf41e not found: ID does not exist" Dec 03 19:19:13 crc kubenswrapper[4731]: I1203 19:19:13.866071 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" path="/var/lib/kubelet/pods/fa4e4006-5fc6-460a-89fa-534e2e2fd8a3/volumes" Dec 03 19:19:17 crc kubenswrapper[4731]: I1203 
19:19:17.753697 4731 scope.go:117] "RemoveContainer" containerID="bfc7fb4df8b00d9f01dfea5ec173884742f3a1a439ee6fac163282b77b77ada9" Dec 03 19:19:17 crc kubenswrapper[4731]: I1203 19:19:17.809968 4731 scope.go:117] "RemoveContainer" containerID="b824719360d5c598d6d63177940c2fff30207b4b666481a901664d9b35d6dd38" Dec 03 19:19:17 crc kubenswrapper[4731]: I1203 19:19:17.996607 4731 scope.go:117] "RemoveContainer" containerID="33662cd7ba2522f0d0ce52ce5d222e87eabfd59d29dac43f44181809555b2c02" Dec 03 19:19:18 crc kubenswrapper[4731]: I1203 19:19:18.019512 4731 scope.go:117] "RemoveContainer" containerID="82c25f4498741178874aec37b41cdd665e3584ee8f0d119e87a79e877938ac42" Dec 03 19:19:18 crc kubenswrapper[4731]: I1203 19:19:18.048020 4731 scope.go:117] "RemoveContainer" containerID="c799c4598778d6cb537fbff91743aa1254ea976f68d020b92903ed42ca10d4d1" Dec 03 19:19:18 crc kubenswrapper[4731]: I1203 19:19:18.068331 4731 scope.go:117] "RemoveContainer" containerID="cbf678eabf96eed394c8395472bdf57b49dfe6cd318626870fa3106aed7d89b5" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.401871 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t6sps"] Dec 03 19:19:21 crc kubenswrapper[4731]: E1203 19:19:21.403115 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerName="registry-server" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.403137 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerName="registry-server" Dec 03 19:19:21 crc kubenswrapper[4731]: E1203 19:19:21.403152 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerName="extract-utilities" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.403161 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerName="extract-utilities" Dec 
03 19:19:21 crc kubenswrapper[4731]: E1203 19:19:21.403174 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerName="extract-content" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.403182 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerName="extract-content" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.403459 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4e4006-5fc6-460a-89fa-534e2e2fd8a3" containerName="registry-server" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.405153 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.414113 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6sps"] Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.555967 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm9lv\" (UniqueName: \"kubernetes.io/projected/44bd2978-c382-4d77-b715-e58fddaff143-kube-api-access-wm9lv\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.556137 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-catalog-content\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.556313 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-utilities\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.658901 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm9lv\" (UniqueName: \"kubernetes.io/projected/44bd2978-c382-4d77-b715-e58fddaff143-kube-api-access-wm9lv\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.659042 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-catalog-content\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.659162 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-utilities\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.659976 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-utilities\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.660101 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-catalog-content\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.691668 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm9lv\" (UniqueName: \"kubernetes.io/projected/44bd2978-c382-4d77-b715-e58fddaff143-kube-api-access-wm9lv\") pod \"community-operators-t6sps\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:21 crc kubenswrapper[4731]: I1203 19:19:21.766604 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:22 crc kubenswrapper[4731]: I1203 19:19:22.335434 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6sps"] Dec 03 19:19:23 crc kubenswrapper[4731]: I1203 19:19:23.057690 4731 generic.go:334] "Generic (PLEG): container finished" podID="44bd2978-c382-4d77-b715-e58fddaff143" containerID="2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7" exitCode=0 Dec 03 19:19:23 crc kubenswrapper[4731]: I1203 19:19:23.057813 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6sps" event={"ID":"44bd2978-c382-4d77-b715-e58fddaff143","Type":"ContainerDied","Data":"2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7"} Dec 03 19:19:23 crc kubenswrapper[4731]: I1203 19:19:23.058142 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6sps" event={"ID":"44bd2978-c382-4d77-b715-e58fddaff143","Type":"ContainerStarted","Data":"ec91b6a80fa6e8e1c1d88720dc5b0644b457d64332de88af73aeb92daeaf0802"} Dec 03 19:19:25 crc kubenswrapper[4731]: I1203 19:19:25.080218 4731 generic.go:334] "Generic (PLEG): container 
finished" podID="44bd2978-c382-4d77-b715-e58fddaff143" containerID="ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3" exitCode=0 Dec 03 19:19:25 crc kubenswrapper[4731]: I1203 19:19:25.080285 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6sps" event={"ID":"44bd2978-c382-4d77-b715-e58fddaff143","Type":"ContainerDied","Data":"ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3"} Dec 03 19:19:26 crc kubenswrapper[4731]: I1203 19:19:26.104641 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6sps" event={"ID":"44bd2978-c382-4d77-b715-e58fddaff143","Type":"ContainerStarted","Data":"0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc"} Dec 03 19:19:26 crc kubenswrapper[4731]: I1203 19:19:26.129428 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t6sps" podStartSLOduration=2.333038197 podStartE2EDuration="5.12939547s" podCreationTimestamp="2025-12-03 19:19:21 +0000 UTC" firstStartedPulling="2025-12-03 19:19:23.061487157 +0000 UTC m=+1483.660081621" lastFinishedPulling="2025-12-03 19:19:25.85784443 +0000 UTC m=+1486.456438894" observedRunningTime="2025-12-03 19:19:26.122837086 +0000 UTC m=+1486.721431580" watchObservedRunningTime="2025-12-03 19:19:26.12939547 +0000 UTC m=+1486.727989934" Dec 03 19:19:26 crc kubenswrapper[4731]: I1203 19:19:26.468533 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:19:26 crc kubenswrapper[4731]: I1203 19:19:26.468636 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" 
podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:19:26 crc kubenswrapper[4731]: I1203 19:19:26.468708 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:19:26 crc kubenswrapper[4731]: I1203 19:19:26.469528 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:19:26 crc kubenswrapper[4731]: I1203 19:19:26.469611 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" gracePeriod=600 Dec 03 19:19:26 crc kubenswrapper[4731]: E1203 19:19:26.608525 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:19:27 crc kubenswrapper[4731]: I1203 19:19:27.116950 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" exitCode=0 Dec 03 
19:19:27 crc kubenswrapper[4731]: I1203 19:19:27.117024 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808"} Dec 03 19:19:27 crc kubenswrapper[4731]: I1203 19:19:27.117309 4731 scope.go:117] "RemoveContainer" containerID="e2f44e072bc88870db26a778f706559cfc353499a5a66a4b1e40841fc6944db0" Dec 03 19:19:27 crc kubenswrapper[4731]: I1203 19:19:27.118318 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:19:27 crc kubenswrapper[4731]: E1203 19:19:27.118626 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:19:31 crc kubenswrapper[4731]: I1203 19:19:31.767457 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:31 crc kubenswrapper[4731]: I1203 19:19:31.769433 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:31 crc kubenswrapper[4731]: I1203 19:19:31.816745 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:32 crc kubenswrapper[4731]: I1203 19:19:32.213247 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:32 crc kubenswrapper[4731]: I1203 
19:19:32.280963 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t6sps"] Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.188532 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t6sps" podUID="44bd2978-c382-4d77-b715-e58fddaff143" containerName="registry-server" containerID="cri-o://0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc" gracePeriod=2 Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.691708 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.735578 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm9lv\" (UniqueName: \"kubernetes.io/projected/44bd2978-c382-4d77-b715-e58fddaff143-kube-api-access-wm9lv\") pod \"44bd2978-c382-4d77-b715-e58fddaff143\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.735735 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-catalog-content\") pod \"44bd2978-c382-4d77-b715-e58fddaff143\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.735838 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-utilities\") pod \"44bd2978-c382-4d77-b715-e58fddaff143\" (UID: \"44bd2978-c382-4d77-b715-e58fddaff143\") " Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.736908 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-utilities" (OuterVolumeSpecName: 
"utilities") pod "44bd2978-c382-4d77-b715-e58fddaff143" (UID: "44bd2978-c382-4d77-b715-e58fddaff143"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.745802 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bd2978-c382-4d77-b715-e58fddaff143-kube-api-access-wm9lv" (OuterVolumeSpecName: "kube-api-access-wm9lv") pod "44bd2978-c382-4d77-b715-e58fddaff143" (UID: "44bd2978-c382-4d77-b715-e58fddaff143"). InnerVolumeSpecName "kube-api-access-wm9lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.786955 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44bd2978-c382-4d77-b715-e58fddaff143" (UID: "44bd2978-c382-4d77-b715-e58fddaff143"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.838288 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.838327 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bd2978-c382-4d77-b715-e58fddaff143-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:19:34 crc kubenswrapper[4731]: I1203 19:19:34.838338 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm9lv\" (UniqueName: \"kubernetes.io/projected/44bd2978-c382-4d77-b715-e58fddaff143-kube-api-access-wm9lv\") on node \"crc\" DevicePath \"\"" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.198791 4731 generic.go:334] "Generic (PLEG): container finished" podID="44bd2978-c382-4d77-b715-e58fddaff143" containerID="0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc" exitCode=0 Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.198843 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6sps" event={"ID":"44bd2978-c382-4d77-b715-e58fddaff143","Type":"ContainerDied","Data":"0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc"} Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.198884 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6sps" event={"ID":"44bd2978-c382-4d77-b715-e58fddaff143","Type":"ContainerDied","Data":"ec91b6a80fa6e8e1c1d88720dc5b0644b457d64332de88af73aeb92daeaf0802"} Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.198908 4731 scope.go:117] "RemoveContainer" containerID="0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 
19:19:35.199013 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6sps" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.238225 4731 scope.go:117] "RemoveContainer" containerID="ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.238535 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t6sps"] Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.252393 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t6sps"] Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.263326 4731 scope.go:117] "RemoveContainer" containerID="2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.317697 4731 scope.go:117] "RemoveContainer" containerID="0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc" Dec 03 19:19:35 crc kubenswrapper[4731]: E1203 19:19:35.318576 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc\": container with ID starting with 0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc not found: ID does not exist" containerID="0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.318653 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc"} err="failed to get container status \"0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc\": rpc error: code = NotFound desc = could not find container \"0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc\": container with ID starting with 
0b642e78aa38a95a88406a4406b676f0bbeada0a603b2e34c32de07d33cc8fcc not found: ID does not exist" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.318714 4731 scope.go:117] "RemoveContainer" containerID="ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3" Dec 03 19:19:35 crc kubenswrapper[4731]: E1203 19:19:35.319217 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3\": container with ID starting with ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3 not found: ID does not exist" containerID="ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.319274 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3"} err="failed to get container status \"ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3\": rpc error: code = NotFound desc = could not find container \"ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3\": container with ID starting with ee262547f02e9f90b3b42afb3a346ed4cca144396047f8bcb306a7fd248b60c3 not found: ID does not exist" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.319291 4731 scope.go:117] "RemoveContainer" containerID="2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7" Dec 03 19:19:35 crc kubenswrapper[4731]: E1203 19:19:35.319554 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7\": container with ID starting with 2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7 not found: ID does not exist" containerID="2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7" Dec 03 19:19:35 crc 
kubenswrapper[4731]: I1203 19:19:35.319577 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7"} err="failed to get container status \"2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7\": rpc error: code = NotFound desc = could not find container \"2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7\": container with ID starting with 2e0a7781f046f324065548c880bec7ddf7a74979fd26c1a09f674519fe40a6e7 not found: ID does not exist" Dec 03 19:19:35 crc kubenswrapper[4731]: I1203 19:19:35.868553 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44bd2978-c382-4d77-b715-e58fddaff143" path="/var/lib/kubelet/pods/44bd2978-c382-4d77-b715-e58fddaff143/volumes" Dec 03 19:19:39 crc kubenswrapper[4731]: I1203 19:19:39.869221 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:19:39 crc kubenswrapper[4731]: E1203 19:19:39.871472 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:19:50 crc kubenswrapper[4731]: I1203 19:19:50.855853 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:19:50 crc kubenswrapper[4731]: E1203 19:19:50.856662 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:20:05 crc kubenswrapper[4731]: I1203 19:20:05.856466 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:20:05 crc kubenswrapper[4731]: E1203 19:20:05.857502 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:20:18 crc kubenswrapper[4731]: I1203 19:20:18.225108 4731 scope.go:117] "RemoveContainer" containerID="205bf3ac547ad1b6a8a2c2acbd2ac9ab4a77f693335ea5ed64330d7579d8ff10" Dec 03 19:20:20 crc kubenswrapper[4731]: I1203 19:20:20.856690 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:20:20 crc kubenswrapper[4731]: E1203 19:20:20.857282 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.317573 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w24jd"] Dec 03 19:20:23 crc kubenswrapper[4731]: E1203 19:20:23.319288 4731 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="44bd2978-c382-4d77-b715-e58fddaff143" containerName="registry-server" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.319316 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bd2978-c382-4d77-b715-e58fddaff143" containerName="registry-server" Dec 03 19:20:23 crc kubenswrapper[4731]: E1203 19:20:23.319391 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bd2978-c382-4d77-b715-e58fddaff143" containerName="extract-utilities" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.319408 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bd2978-c382-4d77-b715-e58fddaff143" containerName="extract-utilities" Dec 03 19:20:23 crc kubenswrapper[4731]: E1203 19:20:23.319442 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bd2978-c382-4d77-b715-e58fddaff143" containerName="extract-content" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.319515 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bd2978-c382-4d77-b715-e58fddaff143" containerName="extract-content" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.319975 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bd2978-c382-4d77-b715-e58fddaff143" containerName="registry-server" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.323494 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.333283 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w24jd"] Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.472102 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqpb\" (UniqueName: \"kubernetes.io/projected/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-kube-api-access-5sqpb\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.472235 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-utilities\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.472560 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-catalog-content\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.574765 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqpb\" (UniqueName: \"kubernetes.io/projected/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-kube-api-access-5sqpb\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.574894 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-utilities\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.574997 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-catalog-content\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.575558 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-utilities\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.575558 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-catalog-content\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.599539 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqpb\" (UniqueName: \"kubernetes.io/projected/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-kube-api-access-5sqpb\") pod \"redhat-operators-w24jd\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:23 crc kubenswrapper[4731]: I1203 19:20:23.650752 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:24 crc kubenswrapper[4731]: I1203 19:20:24.136078 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w24jd"] Dec 03 19:20:24 crc kubenswrapper[4731]: I1203 19:20:24.675874 4731 generic.go:334] "Generic (PLEG): container finished" podID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerID="40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb" exitCode=0 Dec 03 19:20:24 crc kubenswrapper[4731]: I1203 19:20:24.676151 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w24jd" event={"ID":"6d2ed87a-7ed6-4431-9259-75af9cd03bb6","Type":"ContainerDied","Data":"40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb"} Dec 03 19:20:24 crc kubenswrapper[4731]: I1203 19:20:24.676186 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w24jd" event={"ID":"6d2ed87a-7ed6-4431-9259-75af9cd03bb6","Type":"ContainerStarted","Data":"033492fdc27d2322ca8fdd1eac9ba5235520b6579599808bbcb0609fde914a0c"} Dec 03 19:20:24 crc kubenswrapper[4731]: I1203 19:20:24.678649 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:20:26 crc kubenswrapper[4731]: I1203 19:20:26.705174 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w24jd" event={"ID":"6d2ed87a-7ed6-4431-9259-75af9cd03bb6","Type":"ContainerStarted","Data":"63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf"} Dec 03 19:20:28 crc kubenswrapper[4731]: I1203 19:20:28.728294 4731 generic.go:334] "Generic (PLEG): container finished" podID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerID="63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf" exitCode=0 Dec 03 19:20:28 crc kubenswrapper[4731]: I1203 19:20:28.728426 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-w24jd" event={"ID":"6d2ed87a-7ed6-4431-9259-75af9cd03bb6","Type":"ContainerDied","Data":"63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf"} Dec 03 19:20:29 crc kubenswrapper[4731]: I1203 19:20:29.743455 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w24jd" event={"ID":"6d2ed87a-7ed6-4431-9259-75af9cd03bb6","Type":"ContainerStarted","Data":"cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557"} Dec 03 19:20:29 crc kubenswrapper[4731]: I1203 19:20:29.771211 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w24jd" podStartSLOduration=2.233776901 podStartE2EDuration="6.771189598s" podCreationTimestamp="2025-12-03 19:20:23 +0000 UTC" firstStartedPulling="2025-12-03 19:20:24.678354737 +0000 UTC m=+1545.276949201" lastFinishedPulling="2025-12-03 19:20:29.215767434 +0000 UTC m=+1549.814361898" observedRunningTime="2025-12-03 19:20:29.767930446 +0000 UTC m=+1550.366524930" watchObservedRunningTime="2025-12-03 19:20:29.771189598 +0000 UTC m=+1550.369784062" Dec 03 19:20:30 crc kubenswrapper[4731]: I1203 19:20:30.930147 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8mwvn"] Dec 03 19:20:30 crc kubenswrapper[4731]: I1203 19:20:30.933401 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.001389 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mwvn"] Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.046954 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-utilities\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.047360 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-catalog-content\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.047509 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnd82\" (UniqueName: \"kubernetes.io/projected/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-kube-api-access-rnd82\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.149812 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-catalog-content\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.150202 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rnd82\" (UniqueName: \"kubernetes.io/projected/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-kube-api-access-rnd82\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.150239 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-utilities\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.150372 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-catalog-content\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.150912 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-utilities\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.174811 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnd82\" (UniqueName: \"kubernetes.io/projected/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-kube-api-access-rnd82\") pod \"redhat-marketplace-8mwvn\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.257314 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:31 crc kubenswrapper[4731]: W1203 19:20:31.822790 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e63fff_bdfb_4e6e_921b_ca2dbbcde1e6.slice/crio-120c6892242d3096b1905651e8f1d53efefbcc45a7625fe0535f222f3c0d7dfe WatchSource:0}: Error finding container 120c6892242d3096b1905651e8f1d53efefbcc45a7625fe0535f222f3c0d7dfe: Status 404 returned error can't find the container with id 120c6892242d3096b1905651e8f1d53efefbcc45a7625fe0535f222f3c0d7dfe Dec 03 19:20:31 crc kubenswrapper[4731]: I1203 19:20:31.825879 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mwvn"] Dec 03 19:20:32 crc kubenswrapper[4731]: I1203 19:20:32.769610 4731 generic.go:334] "Generic (PLEG): container finished" podID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerID="0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161" exitCode=0 Dec 03 19:20:32 crc kubenswrapper[4731]: I1203 19:20:32.769673 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mwvn" event={"ID":"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6","Type":"ContainerDied","Data":"0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161"} Dec 03 19:20:32 crc kubenswrapper[4731]: I1203 19:20:32.769966 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mwvn" event={"ID":"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6","Type":"ContainerStarted","Data":"120c6892242d3096b1905651e8f1d53efefbcc45a7625fe0535f222f3c0d7dfe"} Dec 03 19:20:33 crc kubenswrapper[4731]: I1203 19:20:33.651747 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:33 crc kubenswrapper[4731]: I1203 19:20:33.652620 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:33 crc kubenswrapper[4731]: I1203 19:20:33.781205 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mwvn" event={"ID":"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6","Type":"ContainerStarted","Data":"d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f"} Dec 03 19:20:34 crc kubenswrapper[4731]: I1203 19:20:34.709048 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w24jd" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="registry-server" probeResult="failure" output=< Dec 03 19:20:34 crc kubenswrapper[4731]: timeout: failed to connect service ":50051" within 1s Dec 03 19:20:34 crc kubenswrapper[4731]: > Dec 03 19:20:34 crc kubenswrapper[4731]: I1203 19:20:34.792047 4731 generic.go:334] "Generic (PLEG): container finished" podID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerID="d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f" exitCode=0 Dec 03 19:20:34 crc kubenswrapper[4731]: I1203 19:20:34.793177 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mwvn" event={"ID":"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6","Type":"ContainerDied","Data":"d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f"} Dec 03 19:20:34 crc kubenswrapper[4731]: I1203 19:20:34.856602 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:20:34 crc kubenswrapper[4731]: E1203 19:20:34.857200 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:20:35 crc kubenswrapper[4731]: I1203 19:20:35.834378 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mwvn" event={"ID":"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6","Type":"ContainerStarted","Data":"a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0"} Dec 03 19:20:35 crc kubenswrapper[4731]: I1203 19:20:35.860719 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8mwvn" podStartSLOduration=3.423528522 podStartE2EDuration="5.860699589s" podCreationTimestamp="2025-12-03 19:20:30 +0000 UTC" firstStartedPulling="2025-12-03 19:20:32.771935274 +0000 UTC m=+1553.370529738" lastFinishedPulling="2025-12-03 19:20:35.209106341 +0000 UTC m=+1555.807700805" observedRunningTime="2025-12-03 19:20:35.859724528 +0000 UTC m=+1556.458319012" watchObservedRunningTime="2025-12-03 19:20:35.860699589 +0000 UTC m=+1556.459294053" Dec 03 19:20:41 crc kubenswrapper[4731]: I1203 19:20:41.258151 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:41 crc kubenswrapper[4731]: I1203 19:20:41.258766 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:41 crc kubenswrapper[4731]: I1203 19:20:41.316749 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:41 crc kubenswrapper[4731]: I1203 19:20:41.939965 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:41 crc kubenswrapper[4731]: I1203 19:20:41.997610 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mwvn"] 
Dec 03 19:20:43 crc kubenswrapper[4731]: I1203 19:20:43.706572 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:43 crc kubenswrapper[4731]: I1203 19:20:43.757480 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:43 crc kubenswrapper[4731]: I1203 19:20:43.912957 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8mwvn" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerName="registry-server" containerID="cri-o://a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0" gracePeriod=2 Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.462695 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.523991 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-catalog-content\") pod \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.524063 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnd82\" (UniqueName: \"kubernetes.io/projected/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-kube-api-access-rnd82\") pod \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\" (UID: \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.524239 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-utilities\") pod \"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\" (UID: 
\"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6\") " Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.525473 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-utilities" (OuterVolumeSpecName: "utilities") pod "c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" (UID: "c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.531226 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-kube-api-access-rnd82" (OuterVolumeSpecName: "kube-api-access-rnd82") pod "c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" (UID: "c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6"). InnerVolumeSpecName "kube-api-access-rnd82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.556197 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" (UID: "c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.626642 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.626705 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnd82\" (UniqueName: \"kubernetes.io/projected/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-kube-api-access-rnd82\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.626719 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.922958 4731 generic.go:334] "Generic (PLEG): container finished" podID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerID="a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0" exitCode=0 Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.923008 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mwvn" event={"ID":"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6","Type":"ContainerDied","Data":"a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0"} Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.923041 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mwvn" event={"ID":"c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6","Type":"ContainerDied","Data":"120c6892242d3096b1905651e8f1d53efefbcc45a7625fe0535f222f3c0d7dfe"} Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.923059 4731 scope.go:117] "RemoveContainer" containerID="a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 
19:20:44.923194 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mwvn" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.956786 4731 scope.go:117] "RemoveContainer" containerID="d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f" Dec 03 19:20:44 crc kubenswrapper[4731]: I1203 19:20:44.998615 4731 scope.go:117] "RemoveContainer" containerID="0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.041002 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mwvn"] Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.051955 4731 scope.go:117] "RemoveContainer" containerID="a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0" Dec 03 19:20:45 crc kubenswrapper[4731]: E1203 19:20:45.054831 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0\": container with ID starting with a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0 not found: ID does not exist" containerID="a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.054874 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0"} err="failed to get container status \"a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0\": rpc error: code = NotFound desc = could not find container \"a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0\": container with ID starting with a223aca216845101f8765930a81efad098379fff0902ba340215fb98cfefa7b0 not found: ID does not exist" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.054900 4731 scope.go:117] 
"RemoveContainer" containerID="d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f" Dec 03 19:20:45 crc kubenswrapper[4731]: E1203 19:20:45.055969 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f\": container with ID starting with d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f not found: ID does not exist" containerID="d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.056025 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f"} err="failed to get container status \"d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f\": rpc error: code = NotFound desc = could not find container \"d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f\": container with ID starting with d2ce5ad11efdc24630d3869e2ef4c73b148db91ecfaca2e66be63e9bff1de20f not found: ID does not exist" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.056060 4731 scope.go:117] "RemoveContainer" containerID="0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161" Dec 03 19:20:45 crc kubenswrapper[4731]: E1203 19:20:45.056617 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161\": container with ID starting with 0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161 not found: ID does not exist" containerID="0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.056677 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161"} err="failed to get container status \"0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161\": rpc error: code = NotFound desc = could not find container \"0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161\": container with ID starting with 0c06d2b3c5d02508eea645e58aff76ee962d3bedfea22abdf62be61fb18f3161 not found: ID does not exist" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.058641 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mwvn"] Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.390114 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w24jd"] Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.390690 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w24jd" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="registry-server" containerID="cri-o://cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557" gracePeriod=2 Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.872377 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" path="/var/lib/kubelet/pods/c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6/volumes" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.884433 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.937171 4731 generic.go:334] "Generic (PLEG): container finished" podID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerID="cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557" exitCode=0 Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.937215 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w24jd" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.937271 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w24jd" event={"ID":"6d2ed87a-7ed6-4431-9259-75af9cd03bb6","Type":"ContainerDied","Data":"cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557"} Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.937337 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w24jd" event={"ID":"6d2ed87a-7ed6-4431-9259-75af9cd03bb6","Type":"ContainerDied","Data":"033492fdc27d2322ca8fdd1eac9ba5235520b6579599808bbcb0609fde914a0c"} Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.937368 4731 scope.go:117] "RemoveContainer" containerID="cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.958206 4731 scope.go:117] "RemoveContainer" containerID="63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf" Dec 03 19:20:45 crc kubenswrapper[4731]: I1203 19:20:45.995070 4731 scope.go:117] "RemoveContainer" containerID="40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.039034 4731 scope.go:117] "RemoveContainer" containerID="cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557" Dec 03 19:20:46 crc kubenswrapper[4731]: E1203 19:20:46.039511 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557\": container with ID starting with cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557 not found: ID does not exist" containerID="cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.039562 4731 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557"} err="failed to get container status \"cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557\": rpc error: code = NotFound desc = could not find container \"cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557\": container with ID starting with cab6045ec654b61375abb502447ac0f34809eca3678bbf2439d71dae9491f557 not found: ID does not exist" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.039588 4731 scope.go:117] "RemoveContainer" containerID="63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf" Dec 03 19:20:46 crc kubenswrapper[4731]: E1203 19:20:46.040110 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf\": container with ID starting with 63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf not found: ID does not exist" containerID="63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.040169 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf"} err="failed to get container status \"63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf\": rpc error: code = NotFound desc = could not find container \"63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf\": container with ID starting with 63a03caeed95d7c0e4c8638675e21aca739c1bbc016675b34147cecbb366aadf not found: ID does not exist" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.040208 4731 scope.go:117] "RemoveContainer" containerID="40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb" Dec 03 19:20:46 crc kubenswrapper[4731]: E1203 19:20:46.040869 4731 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb\": container with ID starting with 40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb not found: ID does not exist" containerID="40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.040895 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb"} err="failed to get container status \"40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb\": rpc error: code = NotFound desc = could not find container \"40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb\": container with ID starting with 40d818df7e11f2057372d0dd92616a86be27b2fc91a30474413d87af54088ecb not found: ID does not exist" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.054977 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-catalog-content\") pod \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.055283 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-utilities\") pod \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\" (UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.055327 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sqpb\" (UniqueName: \"kubernetes.io/projected/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-kube-api-access-5sqpb\") pod \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\" 
(UID: \"6d2ed87a-7ed6-4431-9259-75af9cd03bb6\") " Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.055976 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-utilities" (OuterVolumeSpecName: "utilities") pod "6d2ed87a-7ed6-4431-9259-75af9cd03bb6" (UID: "6d2ed87a-7ed6-4431-9259-75af9cd03bb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.061557 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-kube-api-access-5sqpb" (OuterVolumeSpecName: "kube-api-access-5sqpb") pod "6d2ed87a-7ed6-4431-9259-75af9cd03bb6" (UID: "6d2ed87a-7ed6-4431-9259-75af9cd03bb6"). InnerVolumeSpecName "kube-api-access-5sqpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.158575 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.158625 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sqpb\" (UniqueName: \"kubernetes.io/projected/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-kube-api-access-5sqpb\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.161446 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d2ed87a-7ed6-4431-9259-75af9cd03bb6" (UID: "6d2ed87a-7ed6-4431-9259-75af9cd03bb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.261151 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2ed87a-7ed6-4431-9259-75af9cd03bb6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.272456 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w24jd"] Dec 03 19:20:46 crc kubenswrapper[4731]: I1203 19:20:46.283487 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w24jd"] Dec 03 19:20:47 crc kubenswrapper[4731]: I1203 19:20:47.873489 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" path="/var/lib/kubelet/pods/6d2ed87a-7ed6-4431-9259-75af9cd03bb6/volumes" Dec 03 19:20:48 crc kubenswrapper[4731]: I1203 19:20:48.856063 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:20:48 crc kubenswrapper[4731]: E1203 19:20:48.856722 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:20:52 crc kubenswrapper[4731]: I1203 19:20:52.001434 4731 generic.go:334] "Generic (PLEG): container finished" podID="079e5870-590d-4617-b9de-acdae5e59284" containerID="41b7840c8620ce68978eac5ece020203627e8efb9cdc7490df150accfe1653f6" exitCode=0 Dec 03 19:20:52 crc kubenswrapper[4731]: I1203 19:20:52.001512 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" event={"ID":"079e5870-590d-4617-b9de-acdae5e59284","Type":"ContainerDied","Data":"41b7840c8620ce68978eac5ece020203627e8efb9cdc7490df150accfe1653f6"} Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.497628 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.612054 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-inventory\") pod \"079e5870-590d-4617-b9de-acdae5e59284\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.623545 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-ssh-key\") pod \"079e5870-590d-4617-b9de-acdae5e59284\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.623857 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8cgn\" (UniqueName: \"kubernetes.io/projected/079e5870-590d-4617-b9de-acdae5e59284-kube-api-access-d8cgn\") pod \"079e5870-590d-4617-b9de-acdae5e59284\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.624375 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-bootstrap-combined-ca-bundle\") pod \"079e5870-590d-4617-b9de-acdae5e59284\" (UID: \"079e5870-590d-4617-b9de-acdae5e59284\") " Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.632750 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/079e5870-590d-4617-b9de-acdae5e59284-kube-api-access-d8cgn" (OuterVolumeSpecName: "kube-api-access-d8cgn") pod "079e5870-590d-4617-b9de-acdae5e59284" (UID: "079e5870-590d-4617-b9de-acdae5e59284"). InnerVolumeSpecName "kube-api-access-d8cgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.632844 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "079e5870-590d-4617-b9de-acdae5e59284" (UID: "079e5870-590d-4617-b9de-acdae5e59284"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.647520 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-inventory" (OuterVolumeSpecName: "inventory") pod "079e5870-590d-4617-b9de-acdae5e59284" (UID: "079e5870-590d-4617-b9de-acdae5e59284"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.653810 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "079e5870-590d-4617-b9de-acdae5e59284" (UID: "079e5870-590d-4617-b9de-acdae5e59284"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.726980 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.727013 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.727023 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8cgn\" (UniqueName: \"kubernetes.io/projected/079e5870-590d-4617-b9de-acdae5e59284-kube-api-access-d8cgn\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:53 crc kubenswrapper[4731]: I1203 19:20:53.727034 4731 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079e5870-590d-4617-b9de-acdae5e59284-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.020709 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" event={"ID":"079e5870-590d-4617-b9de-acdae5e59284","Type":"ContainerDied","Data":"c822cea4becdfd9a4b1f287e1bc48cb26b66a9ed1d4794356e9adae712bedcea"} Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.020760 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c822cea4becdfd9a4b1f287e1bc48cb26b66a9ed1d4794356e9adae712bedcea" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.020843 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.175396 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw"] Dec 03 19:20:54 crc kubenswrapper[4731]: E1203 19:20:54.175820 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerName="extract-utilities" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.175837 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerName="extract-utilities" Dec 03 19:20:54 crc kubenswrapper[4731]: E1203 19:20:54.175851 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="registry-server" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.175858 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="registry-server" Dec 03 19:20:54 crc kubenswrapper[4731]: E1203 19:20:54.175874 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079e5870-590d-4617-b9de-acdae5e59284" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.175882 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="079e5870-590d-4617-b9de-acdae5e59284" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 19:20:54 crc kubenswrapper[4731]: E1203 19:20:54.175898 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="extract-content" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.175904 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="extract-content" Dec 03 19:20:54 crc kubenswrapper[4731]: E1203 19:20:54.175912 
4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="extract-utilities" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.175919 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="extract-utilities" Dec 03 19:20:54 crc kubenswrapper[4731]: E1203 19:20:54.175936 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerName="extract-content" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.175942 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerName="extract-content" Dec 03 19:20:54 crc kubenswrapper[4731]: E1203 19:20:54.175953 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerName="registry-server" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.175960 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerName="registry-server" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.176167 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e63fff-bdfb-4e6e-921b-ca2dbbcde1e6" containerName="registry-server" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.176185 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2ed87a-7ed6-4431-9259-75af9cd03bb6" containerName="registry-server" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.176204 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="079e5870-590d-4617-b9de-acdae5e59284" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.176902 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.181494 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.181984 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.182981 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.183096 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.193705 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw"] Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.342425 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.342909 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss57k\" (UniqueName: \"kubernetes.io/projected/e27c61a5-7955-4d79-81a5-6f12322579c0-kube-api-access-ss57k\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 
19:20:54.343059 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.444875 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.445377 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss57k\" (UniqueName: \"kubernetes.io/projected/e27c61a5-7955-4d79-81a5-6f12322579c0-kube-api-access-ss57k\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.445555 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.451650 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.455982 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.464478 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss57k\" (UniqueName: \"kubernetes.io/projected/e27c61a5-7955-4d79-81a5-6f12322579c0-kube-api-access-ss57k\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:54 crc kubenswrapper[4731]: I1203 19:20:54.499356 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:20:55 crc kubenswrapper[4731]: I1203 19:20:55.078783 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw"] Dec 03 19:20:56 crc kubenswrapper[4731]: I1203 19:20:56.040441 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" event={"ID":"e27c61a5-7955-4d79-81a5-6f12322579c0","Type":"ContainerStarted","Data":"fbbee2837d5f05a44b7852440104b09669ec2698631faf1c60f62c538ea49ba7"} Dec 03 19:20:56 crc kubenswrapper[4731]: I1203 19:20:56.040789 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" event={"ID":"e27c61a5-7955-4d79-81a5-6f12322579c0","Type":"ContainerStarted","Data":"7befbe1f97b55b31dea8268e275d71de240416d7f085bf6123e8cf6c6e88c939"} Dec 03 19:20:56 crc kubenswrapper[4731]: I1203 19:20:56.069323 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" podStartSLOduration=1.5359117759999998 podStartE2EDuration="2.06928878s" podCreationTimestamp="2025-12-03 19:20:54 +0000 UTC" firstStartedPulling="2025-12-03 19:20:55.082990665 +0000 UTC m=+1575.681585119" lastFinishedPulling="2025-12-03 19:20:55.616367649 +0000 UTC m=+1576.214962123" observedRunningTime="2025-12-03 19:20:56.057395397 +0000 UTC m=+1576.655989901" watchObservedRunningTime="2025-12-03 19:20:56.06928878 +0000 UTC m=+1576.667883284" Dec 03 19:20:59 crc kubenswrapper[4731]: I1203 19:20:59.878605 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:20:59 crc kubenswrapper[4731]: E1203 19:20:59.879227 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:21:13 crc kubenswrapper[4731]: I1203 19:21:13.856295 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:21:13 crc kubenswrapper[4731]: E1203 19:21:13.857042 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:21:26 crc kubenswrapper[4731]: I1203 19:21:26.856588 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:21:26 crc kubenswrapper[4731]: E1203 19:21:26.857389 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:21:39 crc kubenswrapper[4731]: I1203 19:21:39.882744 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:21:39 crc kubenswrapper[4731]: E1203 19:21:39.889570 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:21:41 crc kubenswrapper[4731]: I1203 19:21:41.046655 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-tmrzz"] Dec 03 19:21:41 crc kubenswrapper[4731]: I1203 19:21:41.059782 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-tmrzz"] Dec 03 19:21:41 crc kubenswrapper[4731]: I1203 19:21:41.872843 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef" path="/var/lib/kubelet/pods/949b2b1c-46fc-4740-9a9e-0f1e88d0d7ef/volumes" Dec 03 19:21:42 crc kubenswrapper[4731]: I1203 19:21:42.034697 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pbxzw"] Dec 03 19:21:42 crc kubenswrapper[4731]: I1203 19:21:42.046709 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d6b0-account-create-update-vkkvz"] Dec 03 19:21:42 crc kubenswrapper[4731]: I1203 19:21:42.058030 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pbxzw"] Dec 03 19:21:42 crc kubenswrapper[4731]: I1203 19:21:42.067050 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d6b0-account-create-update-vkkvz"] Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.044023 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-xbdsh"] Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.057960 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6f66-account-create-update-8242w"] Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.067709 4731 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2324-account-create-update-ll8nj"] Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.076718 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-xbdsh"] Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.085628 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6f66-account-create-update-8242w"] Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.093323 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2324-account-create-update-ll8nj"] Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.871495 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2497e019-571b-42b7-bec0-f1ea5d26c3e9" path="/var/lib/kubelet/pods/2497e019-571b-42b7-bec0-f1ea5d26c3e9/volumes" Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.872793 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe6a778-abea-4237-bdcb-2a9c5a8c2967" path="/var/lib/kubelet/pods/4fe6a778-abea-4237-bdcb-2a9c5a8c2967/volumes" Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.874025 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68165469-3496-4274-9d9b-f397ed17e5c9" path="/var/lib/kubelet/pods/68165469-3496-4274-9d9b-f397ed17e5c9/volumes" Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.875210 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75c482c-bd09-47ec-b8e9-0f114cd19ccd" path="/var/lib/kubelet/pods/c75c482c-bd09-47ec-b8e9-0f114cd19ccd/volumes" Dec 03 19:21:43 crc kubenswrapper[4731]: I1203 19:21:43.876872 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f" path="/var/lib/kubelet/pods/e946b6b9-9dfd-43cc-ae7d-b9e6abcecd0f/volumes" Dec 03 19:21:55 crc kubenswrapper[4731]: I1203 19:21:55.856787 4731 scope.go:117] "RemoveContainer" 
containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:21:55 crc kubenswrapper[4731]: E1203 19:21:55.857533 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:22:08 crc kubenswrapper[4731]: I1203 19:22:08.856198 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:22:08 crc kubenswrapper[4731]: E1203 19:22:08.857987 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.052715 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mjcmb"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.062532 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-jpsqh"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.071983 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ab2e-account-create-update-h8wtr"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.081603 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mjcmb"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.089692 4731 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/barbican-db-create-jpsqh"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.098408 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-k9zst"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.107607 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-69a9-account-create-update-b4j6g"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.116391 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-k9zst"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.125345 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6755-account-create-update-cc66g"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.133635 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ab2e-account-create-update-h8wtr"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.142152 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-69a9-account-create-update-b4j6g"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.149375 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6755-account-create-update-cc66g"] Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.874956 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c398f19-6cd0-4998-ba26-d993cbee31e4" path="/var/lib/kubelet/pods/1c398f19-6cd0-4998-ba26-d993cbee31e4/volumes" Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.876458 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ced43a-3e30-4722-90ac-7b1184354703" path="/var/lib/kubelet/pods/29ced43a-3e30-4722-90ac-7b1184354703/volumes" Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.877690 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4edeffd1-c6df-4546-8e0a-419f4ee657e8" 
path="/var/lib/kubelet/pods/4edeffd1-c6df-4546-8e0a-419f4ee657e8/volumes" Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.878772 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3930526-a0aa-4712-bb01-762d53e5cdba" path="/var/lib/kubelet/pods/c3930526-a0aa-4712-bb01-762d53e5cdba/volumes" Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.880752 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a23e91-45e1-4e8f-84b5-0787b2198d00" path="/var/lib/kubelet/pods/d3a23e91-45e1-4e8f-84b5-0787b2198d00/volumes" Dec 03 19:22:15 crc kubenswrapper[4731]: I1203 19:22:15.881991 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf45b3a-a5e8-4958-86e7-734c6e3f9092" path="/var/lib/kubelet/pods/dcf45b3a-a5e8-4958-86e7-734c6e3f9092/volumes" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.382319 4731 scope.go:117] "RemoveContainer" containerID="31d2c33ff6fb28b9d0b317081d70932a9826f93ad1c9bde1a42583fb371dc4c4" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.409311 4731 scope.go:117] "RemoveContainer" containerID="3e0605093657eb676597f03fae1fad86fadcbcb2d1b7e42904647a25c6f5779e" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.461406 4731 scope.go:117] "RemoveContainer" containerID="198f03e217ed599555ada706b912c4a919dfebb82dd1c70cd5685b5924e9683d" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.512412 4731 scope.go:117] "RemoveContainer" containerID="2923cad73d61f1b745f3426e39b7df22c4384c9481fb1b24b46ccbdd136bbef5" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.556614 4731 scope.go:117] "RemoveContainer" containerID="2269d158aff36f00d0c2d4666ee3154868b2e8cdb219cbe8690e0b96dbd35671" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.598753 4731 scope.go:117] "RemoveContainer" containerID="24b65bd73fe70e2744072a56a4da18ef1c85cc63e3069337c79bfe7b53813d26" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.643856 4731 scope.go:117] "RemoveContainer" 
containerID="ec01825e4d86bf21c3a713dacdee94ab0dca8e9523858441854213da22052130" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.665053 4731 scope.go:117] "RemoveContainer" containerID="1f4eb078260b9ab9b528398b168e5a472464724dc91a27d1d038fec15864c4c2" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.709359 4731 scope.go:117] "RemoveContainer" containerID="2acaab0662167e5d49ff9a22f2a9d1ac7ee085001fa9b2703951bce55946bad3" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.733136 4731 scope.go:117] "RemoveContainer" containerID="93704543c38acbaaee5dcfe5ccc6aa5273a735ab6dd5fe3aed3e8434f7e2b842" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.761370 4731 scope.go:117] "RemoveContainer" containerID="4be66175609a3af1ee882ab7e54ac9a01e4e928de5f74fa191131ebabe8fb1a5" Dec 03 19:22:18 crc kubenswrapper[4731]: I1203 19:22:18.788909 4731 scope.go:117] "RemoveContainer" containerID="c762a6d3c3be6c5441edf98e4a0009a8c7f118e6ca51f92f51dbc9a5227464eb" Dec 03 19:22:21 crc kubenswrapper[4731]: I1203 19:22:21.038654 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-c7rr9"] Dec 03 19:22:21 crc kubenswrapper[4731]: I1203 19:22:21.049827 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-c7rr9"] Dec 03 19:22:21 crc kubenswrapper[4731]: I1203 19:22:21.867007 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9705a938-570d-4016-be7e-60a1ed1ed1cc" path="/var/lib/kubelet/pods/9705a938-570d-4016-be7e-60a1ed1ed1cc/volumes" Dec 03 19:22:22 crc kubenswrapper[4731]: I1203 19:22:22.856493 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:22:22 crc kubenswrapper[4731]: E1203 19:22:22.856810 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:22:33 crc kubenswrapper[4731]: I1203 19:22:33.856884 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:22:33 crc kubenswrapper[4731]: E1203 19:22:33.857868 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:22:43 crc kubenswrapper[4731]: I1203 19:22:43.228973 4731 generic.go:334] "Generic (PLEG): container finished" podID="e27c61a5-7955-4d79-81a5-6f12322579c0" containerID="fbbee2837d5f05a44b7852440104b09669ec2698631faf1c60f62c538ea49ba7" exitCode=0 Dec 03 19:22:43 crc kubenswrapper[4731]: I1203 19:22:43.229090 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" event={"ID":"e27c61a5-7955-4d79-81a5-6f12322579c0","Type":"ContainerDied","Data":"fbbee2837d5f05a44b7852440104b09669ec2698631faf1c60f62c538ea49ba7"} Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.688863 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.856249 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:22:44 crc kubenswrapper[4731]: E1203 19:22:44.856936 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.883532 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-ssh-key\") pod \"e27c61a5-7955-4d79-81a5-6f12322579c0\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.883985 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-inventory\") pod \"e27c61a5-7955-4d79-81a5-6f12322579c0\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.884534 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss57k\" (UniqueName: \"kubernetes.io/projected/e27c61a5-7955-4d79-81a5-6f12322579c0-kube-api-access-ss57k\") pod \"e27c61a5-7955-4d79-81a5-6f12322579c0\" (UID: \"e27c61a5-7955-4d79-81a5-6f12322579c0\") " Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.897751 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e27c61a5-7955-4d79-81a5-6f12322579c0-kube-api-access-ss57k" (OuterVolumeSpecName: "kube-api-access-ss57k") pod "e27c61a5-7955-4d79-81a5-6f12322579c0" (UID: "e27c61a5-7955-4d79-81a5-6f12322579c0"). InnerVolumeSpecName "kube-api-access-ss57k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.917010 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e27c61a5-7955-4d79-81a5-6f12322579c0" (UID: "e27c61a5-7955-4d79-81a5-6f12322579c0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.936055 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-inventory" (OuterVolumeSpecName: "inventory") pod "e27c61a5-7955-4d79-81a5-6f12322579c0" (UID: "e27c61a5-7955-4d79-81a5-6f12322579c0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.986746 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss57k\" (UniqueName: \"kubernetes.io/projected/e27c61a5-7955-4d79-81a5-6f12322579c0-kube-api-access-ss57k\") on node \"crc\" DevicePath \"\"" Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.986787 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:22:44 crc kubenswrapper[4731]: I1203 19:22:44.986800 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27c61a5-7955-4d79-81a5-6f12322579c0-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.281875 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" event={"ID":"e27c61a5-7955-4d79-81a5-6f12322579c0","Type":"ContainerDied","Data":"7befbe1f97b55b31dea8268e275d71de240416d7f085bf6123e8cf6c6e88c939"} Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.281942 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7befbe1f97b55b31dea8268e275d71de240416d7f085bf6123e8cf6c6e88c939" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.281961 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.352674 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj"] Dec 03 19:22:45 crc kubenswrapper[4731]: E1203 19:22:45.353354 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27c61a5-7955-4d79-81a5-6f12322579c0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.353376 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27c61a5-7955-4d79-81a5-6f12322579c0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.353609 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27c61a5-7955-4d79-81a5-6f12322579c0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.354403 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.357771 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.357917 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.357955 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.358012 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.382218 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj"] Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.495754 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.496454 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55slf\" (UniqueName: \"kubernetes.io/projected/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-kube-api-access-55slf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: 
I1203 19:22:45.496500 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.598843 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.598914 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55slf\" (UniqueName: \"kubernetes.io/projected/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-kube-api-access-55slf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.598961 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.604180 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.606995 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.628413 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55slf\" (UniqueName: \"kubernetes.io/projected/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-kube-api-access-55slf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:45 crc kubenswrapper[4731]: I1203 19:22:45.675213 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:22:46 crc kubenswrapper[4731]: I1203 19:22:46.812164 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj"] Dec 03 19:22:47 crc kubenswrapper[4731]: I1203 19:22:47.307181 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" event={"ID":"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26","Type":"ContainerStarted","Data":"c00cef6767718820e54aa461af70afaea20943ee4d8cfbeb1c6ba92cdd0f7fd7"} Dec 03 19:22:48 crc kubenswrapper[4731]: I1203 19:22:48.321449 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" event={"ID":"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26","Type":"ContainerStarted","Data":"dad9b9415c859e7ff4b14913e881ecabaa84029e16f5f11fe060292fd8e7f853"} Dec 03 19:22:48 crc kubenswrapper[4731]: I1203 19:22:48.344057 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" podStartSLOduration=2.963516534 podStartE2EDuration="3.344034198s" podCreationTimestamp="2025-12-03 19:22:45 +0000 UTC" firstStartedPulling="2025-12-03 19:22:46.822458076 +0000 UTC m=+1687.421052540" lastFinishedPulling="2025-12-03 19:22:47.20297573 +0000 UTC m=+1687.801570204" observedRunningTime="2025-12-03 19:22:48.341785748 +0000 UTC m=+1688.940380212" watchObservedRunningTime="2025-12-03 19:22:48.344034198 +0000 UTC m=+1688.942628682" Dec 03 19:22:54 crc kubenswrapper[4731]: I1203 19:22:54.064169 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kr6lg"] Dec 03 19:22:54 crc kubenswrapper[4731]: I1203 19:22:54.106520 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-s4dgb"] Dec 03 19:22:54 crc kubenswrapper[4731]: 
I1203 19:22:54.116997 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kr6lg"] Dec 03 19:22:54 crc kubenswrapper[4731]: I1203 19:22:54.125847 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-s4dgb"] Dec 03 19:22:55 crc kubenswrapper[4731]: I1203 19:22:55.856121 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:22:55 crc kubenswrapper[4731]: E1203 19:22:55.856823 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:22:55 crc kubenswrapper[4731]: I1203 19:22:55.867643 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af3fc14-8410-4706-957b-2f95a972c64e" path="/var/lib/kubelet/pods/5af3fc14-8410-4706-957b-2f95a972c64e/volumes" Dec 03 19:22:55 crc kubenswrapper[4731]: I1203 19:22:55.868575 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7357e9a7-ce03-47ff-a1a5-55b8d1280d31" path="/var/lib/kubelet/pods/7357e9a7-ce03-47ff-a1a5-55b8d1280d31/volumes" Dec 03 19:23:02 crc kubenswrapper[4731]: I1203 19:23:02.046321 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qbnk6"] Dec 03 19:23:02 crc kubenswrapper[4731]: I1203 19:23:02.056045 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qbnk6"] Dec 03 19:23:03 crc kubenswrapper[4731]: I1203 19:23:03.895225 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb136717-d762-4d86-8b8b-e3a59cadb469" 
path="/var/lib/kubelet/pods/eb136717-d762-4d86-8b8b-e3a59cadb469/volumes" Dec 03 19:23:07 crc kubenswrapper[4731]: I1203 19:23:07.035132 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-28phz"] Dec 03 19:23:07 crc kubenswrapper[4731]: I1203 19:23:07.043561 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-28phz"] Dec 03 19:23:07 crc kubenswrapper[4731]: I1203 19:23:07.857847 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:23:07 crc kubenswrapper[4731]: E1203 19:23:07.858168 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:23:07 crc kubenswrapper[4731]: I1203 19:23:07.878947 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74dfcf3b-850d-48ea-9188-297579680f01" path="/var/lib/kubelet/pods/74dfcf3b-850d-48ea-9188-297579680f01/volumes" Dec 03 19:23:08 crc kubenswrapper[4731]: I1203 19:23:08.037025 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tx58l"] Dec 03 19:23:08 crc kubenswrapper[4731]: I1203 19:23:08.048314 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tx58l"] Dec 03 19:23:09 crc kubenswrapper[4731]: I1203 19:23:09.867571 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="936695f7-54f4-4fa7-8373-75c84337ea1f" path="/var/lib/kubelet/pods/936695f7-54f4-4fa7-8373-75c84337ea1f/volumes" Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.022741 4731 scope.go:117] "RemoveContainer" 
containerID="06a27cf623ef61e09a573f659ea97ad6f23d0841eb0e339d059db48eff9ed51c" Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.062926 4731 scope.go:117] "RemoveContainer" containerID="cb6297e3c8a4863ce934262e1f438477f2bc278650540fa7a995eab685f234a0" Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.063830 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bhw9w"] Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.074329 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bhw9w"] Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.111739 4731 scope.go:117] "RemoveContainer" containerID="539ddb5b9571670f65f466f759e585daf9a16e518c7bae9e77c50ec7d85a005d" Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.166680 4731 scope.go:117] "RemoveContainer" containerID="88104e63aace8073210f9d0e6d5e70acec74c4ff0f6f2fbdb25d75f8c0503dba" Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.213403 4731 scope.go:117] "RemoveContainer" containerID="8ffb57c2f3b98c1838f99eeb36c978ef0d034e0d05c43179152a8e68fd6e5972" Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.251633 4731 scope.go:117] "RemoveContainer" containerID="c5a7e7747a6e165d39c33b622b7b07df305d27612d046d6e8b4a19635eb321bc" Dec 03 19:23:19 crc kubenswrapper[4731]: I1203 19:23:19.866833 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50dcbd31-ca7a-47bc-831f-e5f5e2da78ee" path="/var/lib/kubelet/pods/50dcbd31-ca7a-47bc-831f-e5f5e2da78ee/volumes" Dec 03 19:23:22 crc kubenswrapper[4731]: I1203 19:23:22.856325 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:23:22 crc kubenswrapper[4731]: E1203 19:23:22.856993 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:23:35 crc kubenswrapper[4731]: I1203 19:23:35.860118 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:23:35 crc kubenswrapper[4731]: E1203 19:23:35.860905 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:23:47 crc kubenswrapper[4731]: I1203 19:23:47.857169 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:23:47 crc kubenswrapper[4731]: E1203 19:23:47.858083 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.045392 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-439b-account-create-update-28tkb"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.055146 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-pcjrh"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.064126 4731 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-439b-account-create-update-28tkb"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.073776 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-pcjrh"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.082267 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-539b-account-create-update-g5ghg"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.089368 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-s9jrx"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.095903 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xtlvb"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.102600 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xtlvb"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.110053 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-539b-account-create-update-g5ghg"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.117232 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-s9jrx"] Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.871359 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6934159f-6d4b-4243-a5c3-0092fe5f58c4" path="/var/lib/kubelet/pods/6934159f-6d4b-4243-a5c3-0092fe5f58c4/volumes" Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.872636 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8229af75-4935-4593-b482-7c841a710cd9" path="/var/lib/kubelet/pods/8229af75-4935-4593-b482-7c841a710cd9/volumes" Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.873492 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86009e3d-4e12-4660-8104-b71fade8668f" path="/var/lib/kubelet/pods/86009e3d-4e12-4660-8104-b71fade8668f/volumes" Dec 
03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.874357 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1aa8978-06f7-4e6c-9e3b-3edbf20878bd" path="/var/lib/kubelet/pods/c1aa8978-06f7-4e6c-9e3b-3edbf20878bd/volumes" Dec 03 19:23:57 crc kubenswrapper[4731]: I1203 19:23:57.876370 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2992ad-2bee-42b4-8dcc-4584ffa3b336" path="/var/lib/kubelet/pods/ff2992ad-2bee-42b4-8dcc-4584ffa3b336/volumes" Dec 03 19:23:58 crc kubenswrapper[4731]: I1203 19:23:58.050692 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c53f-account-create-update-vbgt5"] Dec 03 19:23:58 crc kubenswrapper[4731]: I1203 19:23:58.085348 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c53f-account-create-update-vbgt5"] Dec 03 19:23:59 crc kubenswrapper[4731]: I1203 19:23:59.870408 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d64c6991-7b2b-4432-8e99-ac9d1989b0ec" path="/var/lib/kubelet/pods/d64c6991-7b2b-4432-8e99-ac9d1989b0ec/volumes" Dec 03 19:24:02 crc kubenswrapper[4731]: I1203 19:24:02.084557 4731 generic.go:334] "Generic (PLEG): container finished" podID="d668caaa-0ba4-4cbe-8fce-8154cf9b8b26" containerID="dad9b9415c859e7ff4b14913e881ecabaa84029e16f5f11fe060292fd8e7f853" exitCode=0 Dec 03 19:24:02 crc kubenswrapper[4731]: I1203 19:24:02.084647 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" event={"ID":"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26","Type":"ContainerDied","Data":"dad9b9415c859e7ff4b14913e881ecabaa84029e16f5f11fe060292fd8e7f853"} Dec 03 19:24:02 crc kubenswrapper[4731]: I1203 19:24:02.856048 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:24:02 crc kubenswrapper[4731]: E1203 19:24:02.857405 4731 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.646985 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.755396 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-inventory\") pod \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.755484 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55slf\" (UniqueName: \"kubernetes.io/projected/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-kube-api-access-55slf\") pod \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.755564 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-ssh-key\") pod \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\" (UID: \"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26\") " Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.793029 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-kube-api-access-55slf" (OuterVolumeSpecName: "kube-api-access-55slf") pod "d668caaa-0ba4-4cbe-8fce-8154cf9b8b26" (UID: 
"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26"). InnerVolumeSpecName "kube-api-access-55slf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.800958 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d668caaa-0ba4-4cbe-8fce-8154cf9b8b26" (UID: "d668caaa-0ba4-4cbe-8fce-8154cf9b8b26"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.827297 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-inventory" (OuterVolumeSpecName: "inventory") pod "d668caaa-0ba4-4cbe-8fce-8154cf9b8b26" (UID: "d668caaa-0ba4-4cbe-8fce-8154cf9b8b26"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.858896 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.858945 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55slf\" (UniqueName: \"kubernetes.io/projected/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-kube-api-access-55slf\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:03 crc kubenswrapper[4731]: I1203 19:24:03.858965 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d668caaa-0ba4-4cbe-8fce-8154cf9b8b26-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.141279 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" 
event={"ID":"d668caaa-0ba4-4cbe-8fce-8154cf9b8b26","Type":"ContainerDied","Data":"c00cef6767718820e54aa461af70afaea20943ee4d8cfbeb1c6ba92cdd0f7fd7"} Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.141326 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00cef6767718820e54aa461af70afaea20943ee4d8cfbeb1c6ba92cdd0f7fd7" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.141392 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.215199 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k"] Dec 03 19:24:04 crc kubenswrapper[4731]: E1203 19:24:04.215744 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d668caaa-0ba4-4cbe-8fce-8154cf9b8b26" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.215766 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d668caaa-0ba4-4cbe-8fce-8154cf9b8b26" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.216006 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d668caaa-0ba4-4cbe-8fce-8154cf9b8b26" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.216897 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.222743 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.223034 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.223080 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.223459 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.225962 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k"] Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.369542 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.369614 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.369731 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mfw\" (UniqueName: \"kubernetes.io/projected/1728626c-786d-4913-9501-a8286b12f474-kube-api-access-w4mfw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.472787 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.472910 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.473033 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4mfw\" (UniqueName: \"kubernetes.io/projected/1728626c-786d-4913-9501-a8286b12f474-kube-api-access-w4mfw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.477875 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.478115 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.496666 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mfw\" (UniqueName: \"kubernetes.io/projected/1728626c-786d-4913-9501-a8286b12f474-kube-api-access-w4mfw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-69k6k\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:04 crc kubenswrapper[4731]: I1203 19:24:04.552679 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:05 crc kubenswrapper[4731]: I1203 19:24:05.140303 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k"] Dec 03 19:24:05 crc kubenswrapper[4731]: I1203 19:24:05.154551 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" event={"ID":"1728626c-786d-4913-9501-a8286b12f474","Type":"ContainerStarted","Data":"99972151960ce4525d56de7f3e1c94507438d0f1c9c450513da2d9b66fcd833d"} Dec 03 19:24:06 crc kubenswrapper[4731]: I1203 19:24:06.167686 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" event={"ID":"1728626c-786d-4913-9501-a8286b12f474","Type":"ContainerStarted","Data":"7c9253f6189e3c1597f50786934f7a875c67f4624845887d81a3e6c99c5a4da0"} Dec 03 19:24:06 crc kubenswrapper[4731]: I1203 19:24:06.195710 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" podStartSLOduration=1.7677463279999999 podStartE2EDuration="2.195677517s" podCreationTimestamp="2025-12-03 19:24:04 +0000 UTC" firstStartedPulling="2025-12-03 19:24:05.143667748 +0000 UTC m=+1765.742262212" lastFinishedPulling="2025-12-03 19:24:05.571598937 +0000 UTC m=+1766.170193401" observedRunningTime="2025-12-03 19:24:06.185743966 +0000 UTC m=+1766.784338450" watchObservedRunningTime="2025-12-03 19:24:06.195677517 +0000 UTC m=+1766.794271991" Dec 03 19:24:11 crc kubenswrapper[4731]: I1203 19:24:11.220732 4731 generic.go:334] "Generic (PLEG): container finished" podID="1728626c-786d-4913-9501-a8286b12f474" containerID="7c9253f6189e3c1597f50786934f7a875c67f4624845887d81a3e6c99c5a4da0" exitCode=0 Dec 03 19:24:11 crc kubenswrapper[4731]: I1203 19:24:11.220877 4731 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" event={"ID":"1728626c-786d-4913-9501-a8286b12f474","Type":"ContainerDied","Data":"7c9253f6189e3c1597f50786934f7a875c67f4624845887d81a3e6c99c5a4da0"} Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.737533 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.873386 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4mfw\" (UniqueName: \"kubernetes.io/projected/1728626c-786d-4913-9501-a8286b12f474-kube-api-access-w4mfw\") pod \"1728626c-786d-4913-9501-a8286b12f474\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.873459 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-inventory\") pod \"1728626c-786d-4913-9501-a8286b12f474\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.873666 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-ssh-key\") pod \"1728626c-786d-4913-9501-a8286b12f474\" (UID: \"1728626c-786d-4913-9501-a8286b12f474\") " Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.879446 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1728626c-786d-4913-9501-a8286b12f474-kube-api-access-w4mfw" (OuterVolumeSpecName: "kube-api-access-w4mfw") pod "1728626c-786d-4913-9501-a8286b12f474" (UID: "1728626c-786d-4913-9501-a8286b12f474"). InnerVolumeSpecName "kube-api-access-w4mfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.902205 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1728626c-786d-4913-9501-a8286b12f474" (UID: "1728626c-786d-4913-9501-a8286b12f474"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.903069 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-inventory" (OuterVolumeSpecName: "inventory") pod "1728626c-786d-4913-9501-a8286b12f474" (UID: "1728626c-786d-4913-9501-a8286b12f474"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.976503 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.976552 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4mfw\" (UniqueName: \"kubernetes.io/projected/1728626c-786d-4913-9501-a8286b12f474-kube-api-access-w4mfw\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:12 crc kubenswrapper[4731]: I1203 19:24:12.976566 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1728626c-786d-4913-9501-a8286b12f474-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.286564 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" 
event={"ID":"1728626c-786d-4913-9501-a8286b12f474","Type":"ContainerDied","Data":"99972151960ce4525d56de7f3e1c94507438d0f1c9c450513da2d9b66fcd833d"} Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.286620 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99972151960ce4525d56de7f3e1c94507438d0f1c9c450513da2d9b66fcd833d" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.286696 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-69k6k" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.338737 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4"] Dec 03 19:24:13 crc kubenswrapper[4731]: E1203 19:24:13.339373 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1728626c-786d-4913-9501-a8286b12f474" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.339399 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1728626c-786d-4913-9501-a8286b12f474" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.339638 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1728626c-786d-4913-9501-a8286b12f474" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.340640 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.344123 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.344867 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.345068 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.345226 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.348789 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4"] Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.486713 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.486820 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.486917 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v76gr\" (UniqueName: \"kubernetes.io/projected/d77a7c18-1300-42ea-8c28-910dcea576ff-kube-api-access-v76gr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.589689 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.589836 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.589883 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v76gr\" (UniqueName: \"kubernetes.io/projected/d77a7c18-1300-42ea-8c28-910dcea576ff-kube-api-access-v76gr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.595993 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: 
\"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.602877 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.613886 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v76gr\" (UniqueName: \"kubernetes.io/projected/d77a7c18-1300-42ea-8c28-910dcea576ff-kube-api-access-v76gr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sfjq4\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:13 crc kubenswrapper[4731]: I1203 19:24:13.663372 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:14 crc kubenswrapper[4731]: I1203 19:24:14.234128 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4"] Dec 03 19:24:14 crc kubenswrapper[4731]: I1203 19:24:14.299315 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" event={"ID":"d77a7c18-1300-42ea-8c28-910dcea576ff","Type":"ContainerStarted","Data":"b95de737f5727ea8a96bb679eb2f13a3129b09878629ed8ee00854f8b42f3148"} Dec 03 19:24:14 crc kubenswrapper[4731]: I1203 19:24:14.855978 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:24:14 crc kubenswrapper[4731]: E1203 19:24:14.856409 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:24:15 crc kubenswrapper[4731]: I1203 19:24:15.319907 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" event={"ID":"d77a7c18-1300-42ea-8c28-910dcea576ff","Type":"ContainerStarted","Data":"2eb428b6db8c3c9a9d3aab825cc0f76b6080008366fd78b6c5b67b652e66c303"} Dec 03 19:24:15 crc kubenswrapper[4731]: I1203 19:24:15.356450 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" podStartSLOduration=1.918651978 podStartE2EDuration="2.356418644s" podCreationTimestamp="2025-12-03 19:24:13 +0000 UTC" firstStartedPulling="2025-12-03 
19:24:14.23874075 +0000 UTC m=+1774.837335214" lastFinishedPulling="2025-12-03 19:24:14.676507416 +0000 UTC m=+1775.275101880" observedRunningTime="2025-12-03 19:24:15.341965342 +0000 UTC m=+1775.940559886" watchObservedRunningTime="2025-12-03 19:24:15.356418644 +0000 UTC m=+1775.955013118" Dec 03 19:24:19 crc kubenswrapper[4731]: I1203 19:24:19.415759 4731 scope.go:117] "RemoveContainer" containerID="3543322e0433654a37eee878cab3b4bbbc3fa0af2fe9c9a0972efffd1f2eae76" Dec 03 19:24:19 crc kubenswrapper[4731]: I1203 19:24:19.481416 4731 scope.go:117] "RemoveContainer" containerID="727d85169cb5d22ec89098dd70aeedcb5893b3202cfa46965abdb2b302422cc6" Dec 03 19:24:19 crc kubenswrapper[4731]: I1203 19:24:19.507397 4731 scope.go:117] "RemoveContainer" containerID="efdb1ff6a0fcba15ad7c8ec2da260a07ec9bcbbdab3c0c63643abf3612e90a4f" Dec 03 19:24:19 crc kubenswrapper[4731]: I1203 19:24:19.548439 4731 scope.go:117] "RemoveContainer" containerID="badfb0e4c93ca53b210be7ff8c1211507828c501206f2e02920f6269450aa1bb" Dec 03 19:24:19 crc kubenswrapper[4731]: I1203 19:24:19.601221 4731 scope.go:117] "RemoveContainer" containerID="a8d912d9e70a0de3d02a17fb58f5268c48a72eb303961127fe5ffc6be5f190fe" Dec 03 19:24:19 crc kubenswrapper[4731]: I1203 19:24:19.645316 4731 scope.go:117] "RemoveContainer" containerID="46feab51ee5e267f863d80bcdd090558775a80dc0fe9726b173c4f51b8b4d2d2" Dec 03 19:24:19 crc kubenswrapper[4731]: I1203 19:24:19.688602 4731 scope.go:117] "RemoveContainer" containerID="5246cba9874413165f2d2d3b2375af0bc5ab26225941a46ef8a97fe67863cb92" Dec 03 19:24:28 crc kubenswrapper[4731]: I1203 19:24:28.857497 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:24:29 crc kubenswrapper[4731]: I1203 19:24:29.458764 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" 
event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"7cf6f60cbac1430c262c450bc8ec3373da0e50f736eaecd23c4a7f1ca121d110"} Dec 03 19:24:30 crc kubenswrapper[4731]: I1203 19:24:30.045719 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nrxcp"] Dec 03 19:24:30 crc kubenswrapper[4731]: I1203 19:24:30.054798 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nrxcp"] Dec 03 19:24:31 crc kubenswrapper[4731]: I1203 19:24:31.869179 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996da76c-8786-41f9-aa84-4722e2755f0e" path="/var/lib/kubelet/pods/996da76c-8786-41f9-aa84-4722e2755f0e/volumes" Dec 03 19:24:55 crc kubenswrapper[4731]: I1203 19:24:55.747008 4731 generic.go:334] "Generic (PLEG): container finished" podID="d77a7c18-1300-42ea-8c28-910dcea576ff" containerID="2eb428b6db8c3c9a9d3aab825cc0f76b6080008366fd78b6c5b67b652e66c303" exitCode=0 Dec 03 19:24:55 crc kubenswrapper[4731]: I1203 19:24:55.747095 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" event={"ID":"d77a7c18-1300-42ea-8c28-910dcea576ff","Type":"ContainerDied","Data":"2eb428b6db8c3c9a9d3aab825cc0f76b6080008366fd78b6c5b67b652e66c303"} Dec 03 19:24:56 crc kubenswrapper[4731]: I1203 19:24:56.073759 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rbmpj"] Dec 03 19:24:56 crc kubenswrapper[4731]: I1203 19:24:56.087415 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rbmpj"] Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.041172 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5fcx2"] Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.064409 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-cell-mapping-5fcx2"] Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.215072 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.383540 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-inventory\") pod \"d77a7c18-1300-42ea-8c28-910dcea576ff\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.383669 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v76gr\" (UniqueName: \"kubernetes.io/projected/d77a7c18-1300-42ea-8c28-910dcea576ff-kube-api-access-v76gr\") pod \"d77a7c18-1300-42ea-8c28-910dcea576ff\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.383866 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-ssh-key\") pod \"d77a7c18-1300-42ea-8c28-910dcea576ff\" (UID: \"d77a7c18-1300-42ea-8c28-910dcea576ff\") " Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.405215 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77a7c18-1300-42ea-8c28-910dcea576ff-kube-api-access-v76gr" (OuterVolumeSpecName: "kube-api-access-v76gr") pod "d77a7c18-1300-42ea-8c28-910dcea576ff" (UID: "d77a7c18-1300-42ea-8c28-910dcea576ff"). InnerVolumeSpecName "kube-api-access-v76gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.425697 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d77a7c18-1300-42ea-8c28-910dcea576ff" (UID: "d77a7c18-1300-42ea-8c28-910dcea576ff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.429324 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-inventory" (OuterVolumeSpecName: "inventory") pod "d77a7c18-1300-42ea-8c28-910dcea576ff" (UID: "d77a7c18-1300-42ea-8c28-910dcea576ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.486714 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.487139 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77a7c18-1300-42ea-8c28-910dcea576ff-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.487150 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v76gr\" (UniqueName: \"kubernetes.io/projected/d77a7c18-1300-42ea-8c28-910dcea576ff-kube-api-access-v76gr\") on node \"crc\" DevicePath \"\"" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.771833 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" 
event={"ID":"d77a7c18-1300-42ea-8c28-910dcea576ff","Type":"ContainerDied","Data":"b95de737f5727ea8a96bb679eb2f13a3129b09878629ed8ee00854f8b42f3148"} Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.771900 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sfjq4" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.771903 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b95de737f5727ea8a96bb679eb2f13a3129b09878629ed8ee00854f8b42f3148" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.871309 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="676bb34b-7c07-46fb-bf1b-62e21e5293f8" path="/var/lib/kubelet/pods/676bb34b-7c07-46fb-bf1b-62e21e5293f8/volumes" Dec 03 19:24:57 crc kubenswrapper[4731]: I1203 19:24:57.872885 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0" path="/var/lib/kubelet/pods/7b0a2f42-1aee-41ee-87ce-ddd561f3c7a0/volumes" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.042459 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p"] Dec 03 19:24:58 crc kubenswrapper[4731]: E1203 19:24:58.043232 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77a7c18-1300-42ea-8c28-910dcea576ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.043270 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77a7c18-1300-42ea-8c28-910dcea576ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.043497 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77a7c18-1300-42ea-8c28-910dcea576ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:24:58 crc 
kubenswrapper[4731]: I1203 19:24:58.046621 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.049122 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.049749 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.049891 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.050149 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.056119 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p"] Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.205431 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8zrn\" (UniqueName: \"kubernetes.io/projected/1bdcc986-86f8-47a5-9856-b3e0969e9d29-kube-api-access-q8zrn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.205500 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.205666 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.307290 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8zrn\" (UniqueName: \"kubernetes.io/projected/1bdcc986-86f8-47a5-9856-b3e0969e9d29-kube-api-access-q8zrn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.307377 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.307478 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.314399 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.314455 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.332998 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8zrn\" (UniqueName: \"kubernetes.io/projected/1bdcc986-86f8-47a5-9856-b3e0969e9d29-kube-api-access-q8zrn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2826p\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:58 crc kubenswrapper[4731]: I1203 19:24:58.380110 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:24:59 crc kubenswrapper[4731]: I1203 19:24:59.115909 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p"] Dec 03 19:24:59 crc kubenswrapper[4731]: I1203 19:24:59.796318 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" event={"ID":"1bdcc986-86f8-47a5-9856-b3e0969e9d29","Type":"ContainerStarted","Data":"d8b206eadbfafdf4d2279f27d1c3966239d60c4571d66d3fffe9b8bcafdbae0c"} Dec 03 19:24:59 crc kubenswrapper[4731]: I1203 19:24:59.796852 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" event={"ID":"1bdcc986-86f8-47a5-9856-b3e0969e9d29","Type":"ContainerStarted","Data":"f11d79dfffe4ae1f35737d6c404aebbea8d7c719205df51a3545ddf2d2d06cce"} Dec 03 19:25:19 crc kubenswrapper[4731]: I1203 19:25:19.887726 4731 scope.go:117] "RemoveContainer" containerID="524660bf6f72d8fcc0b4c01e70ecca6b752e42fcdbf72e4b972d4019db523655" Dec 03 19:25:19 crc kubenswrapper[4731]: I1203 19:25:19.947384 4731 scope.go:117] "RemoveContainer" containerID="76137f6787685f877f7650d2c7f7cc5b85e27124453be78f4ae6258123f56ba9" Dec 03 19:25:20 crc kubenswrapper[4731]: I1203 19:25:20.014708 4731 scope.go:117] "RemoveContainer" containerID="67998100f37b7a848af7004d44074badb97c0e3bd8c716933a0f6321765c73ef" Dec 03 19:25:41 crc kubenswrapper[4731]: I1203 19:25:41.043817 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" podStartSLOduration=42.61466939 podStartE2EDuration="43.043794087s" podCreationTimestamp="2025-12-03 19:24:58 +0000 UTC" firstStartedPulling="2025-12-03 19:24:59.123241544 +0000 UTC m=+1819.721836008" lastFinishedPulling="2025-12-03 19:24:59.552366241 +0000 UTC m=+1820.150960705" 
observedRunningTime="2025-12-03 19:24:59.819165694 +0000 UTC m=+1820.417760178" watchObservedRunningTime="2025-12-03 19:25:41.043794087 +0000 UTC m=+1861.642388551" Dec 03 19:25:41 crc kubenswrapper[4731]: I1203 19:25:41.050354 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-m77rq"] Dec 03 19:25:41 crc kubenswrapper[4731]: I1203 19:25:41.059383 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-m77rq"] Dec 03 19:25:41 crc kubenswrapper[4731]: I1203 19:25:41.870831 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a4b141-8350-4132-8173-04c0dc9cc328" path="/var/lib/kubelet/pods/01a4b141-8350-4132-8173-04c0dc9cc328/volumes" Dec 03 19:25:57 crc kubenswrapper[4731]: I1203 19:25:57.407042 4731 generic.go:334] "Generic (PLEG): container finished" podID="1bdcc986-86f8-47a5-9856-b3e0969e9d29" containerID="d8b206eadbfafdf4d2279f27d1c3966239d60c4571d66d3fffe9b8bcafdbae0c" exitCode=0 Dec 03 19:25:57 crc kubenswrapper[4731]: I1203 19:25:57.407141 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" event={"ID":"1bdcc986-86f8-47a5-9856-b3e0969e9d29","Type":"ContainerDied","Data":"d8b206eadbfafdf4d2279f27d1c3966239d60c4571d66d3fffe9b8bcafdbae0c"} Dec 03 19:25:58 crc kubenswrapper[4731]: I1203 19:25:58.824042 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:25:58 crc kubenswrapper[4731]: I1203 19:25:58.958276 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-inventory\") pod \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " Dec 03 19:25:58 crc kubenswrapper[4731]: I1203 19:25:58.958600 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8zrn\" (UniqueName: \"kubernetes.io/projected/1bdcc986-86f8-47a5-9856-b3e0969e9d29-kube-api-access-q8zrn\") pod \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " Dec 03 19:25:58 crc kubenswrapper[4731]: I1203 19:25:58.958687 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-ssh-key\") pod \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\" (UID: \"1bdcc986-86f8-47a5-9856-b3e0969e9d29\") " Dec 03 19:25:58 crc kubenswrapper[4731]: I1203 19:25:58.966567 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bdcc986-86f8-47a5-9856-b3e0969e9d29-kube-api-access-q8zrn" (OuterVolumeSpecName: "kube-api-access-q8zrn") pod "1bdcc986-86f8-47a5-9856-b3e0969e9d29" (UID: "1bdcc986-86f8-47a5-9856-b3e0969e9d29"). InnerVolumeSpecName "kube-api-access-q8zrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.007601 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-inventory" (OuterVolumeSpecName: "inventory") pod "1bdcc986-86f8-47a5-9856-b3e0969e9d29" (UID: "1bdcc986-86f8-47a5-9856-b3e0969e9d29"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.014006 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1bdcc986-86f8-47a5-9856-b3e0969e9d29" (UID: "1bdcc986-86f8-47a5-9856-b3e0969e9d29"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.060829 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8zrn\" (UniqueName: \"kubernetes.io/projected/1bdcc986-86f8-47a5-9856-b3e0969e9d29-kube-api-access-q8zrn\") on node \"crc\" DevicePath \"\"" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.060871 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.060881 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdcc986-86f8-47a5-9856-b3e0969e9d29-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.430367 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" event={"ID":"1bdcc986-86f8-47a5-9856-b3e0969e9d29","Type":"ContainerDied","Data":"f11d79dfffe4ae1f35737d6c404aebbea8d7c719205df51a3545ddf2d2d06cce"} Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.430723 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f11d79dfffe4ae1f35737d6c404aebbea8d7c719205df51a3545ddf2d2d06cce" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.430627 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2826p" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.543198 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rsrfq"] Dec 03 19:25:59 crc kubenswrapper[4731]: E1203 19:25:59.543788 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdcc986-86f8-47a5-9856-b3e0969e9d29" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.543811 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdcc986-86f8-47a5-9856-b3e0969e9d29" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.544054 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdcc986-86f8-47a5-9856-b3e0969e9d29" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.544853 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.547518 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.547933 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.548084 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.548537 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.555558 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rsrfq"] Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.683521 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.683579 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwtc2\" (UniqueName: \"kubernetes.io/projected/2ec83323-6f0b-4824-aff6-c68a5b5628cd-kube-api-access-vwtc2\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.683712 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.787041 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.787174 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.787205 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwtc2\" (UniqueName: \"kubernetes.io/projected/2ec83323-6f0b-4824-aff6-c68a5b5628cd-kube-api-access-vwtc2\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.792839 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: 
I1203 19:25:59.794125 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.806755 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwtc2\" (UniqueName: \"kubernetes.io/projected/2ec83323-6f0b-4824-aff6-c68a5b5628cd-kube-api-access-vwtc2\") pod \"ssh-known-hosts-edpm-deployment-rsrfq\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:25:59 crc kubenswrapper[4731]: I1203 19:25:59.898470 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:26:00 crc kubenswrapper[4731]: I1203 19:26:00.401803 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rsrfq"] Dec 03 19:26:00 crc kubenswrapper[4731]: I1203 19:26:00.407380 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:26:00 crc kubenswrapper[4731]: I1203 19:26:00.442425 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" event={"ID":"2ec83323-6f0b-4824-aff6-c68a5b5628cd","Type":"ContainerStarted","Data":"9749658775e3bcbf1e72b9921f5c43687b6c4b9edc68aec37d477fc23e279f84"} Dec 03 19:26:01 crc kubenswrapper[4731]: I1203 19:26:01.454347 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" event={"ID":"2ec83323-6f0b-4824-aff6-c68a5b5628cd","Type":"ContainerStarted","Data":"6f0a0fc824dbfb408e0ee33d628ba315059bc2788fe1551ddd0dc85c74f02322"} Dec 03 19:26:01 crc kubenswrapper[4731]: I1203 
19:26:01.476688 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" podStartSLOduration=2.075605248 podStartE2EDuration="2.47666387s" podCreationTimestamp="2025-12-03 19:25:59 +0000 UTC" firstStartedPulling="2025-12-03 19:26:00.407135762 +0000 UTC m=+1881.005730226" lastFinishedPulling="2025-12-03 19:26:00.808194384 +0000 UTC m=+1881.406788848" observedRunningTime="2025-12-03 19:26:01.471729187 +0000 UTC m=+1882.070323651" watchObservedRunningTime="2025-12-03 19:26:01.47666387 +0000 UTC m=+1882.075258334" Dec 03 19:26:09 crc kubenswrapper[4731]: I1203 19:26:09.564368 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" event={"ID":"2ec83323-6f0b-4824-aff6-c68a5b5628cd","Type":"ContainerDied","Data":"6f0a0fc824dbfb408e0ee33d628ba315059bc2788fe1551ddd0dc85c74f02322"} Dec 03 19:26:09 crc kubenswrapper[4731]: I1203 19:26:09.564344 4731 generic.go:334] "Generic (PLEG): container finished" podID="2ec83323-6f0b-4824-aff6-c68a5b5628cd" containerID="6f0a0fc824dbfb408e0ee33d628ba315059bc2788fe1551ddd0dc85c74f02322" exitCode=0 Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.011362 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.153385 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwtc2\" (UniqueName: \"kubernetes.io/projected/2ec83323-6f0b-4824-aff6-c68a5b5628cd-kube-api-access-vwtc2\") pod \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.153526 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-inventory-0\") pod \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.153642 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-ssh-key-openstack-edpm-ipam\") pod \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\" (UID: \"2ec83323-6f0b-4824-aff6-c68a5b5628cd\") " Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.163734 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec83323-6f0b-4824-aff6-c68a5b5628cd-kube-api-access-vwtc2" (OuterVolumeSpecName: "kube-api-access-vwtc2") pod "2ec83323-6f0b-4824-aff6-c68a5b5628cd" (UID: "2ec83323-6f0b-4824-aff6-c68a5b5628cd"). InnerVolumeSpecName "kube-api-access-vwtc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.185630 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2ec83323-6f0b-4824-aff6-c68a5b5628cd" (UID: "2ec83323-6f0b-4824-aff6-c68a5b5628cd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.206679 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "2ec83323-6f0b-4824-aff6-c68a5b5628cd" (UID: "2ec83323-6f0b-4824-aff6-c68a5b5628cd"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.256480 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwtc2\" (UniqueName: \"kubernetes.io/projected/2ec83323-6f0b-4824-aff6-c68a5b5628cd-kube-api-access-vwtc2\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.256731 4731 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.256804 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ec83323-6f0b-4824-aff6-c68a5b5628cd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.583873 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" 
event={"ID":"2ec83323-6f0b-4824-aff6-c68a5b5628cd","Type":"ContainerDied","Data":"9749658775e3bcbf1e72b9921f5c43687b6c4b9edc68aec37d477fc23e279f84"} Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.584177 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9749658775e3bcbf1e72b9921f5c43687b6c4b9edc68aec37d477fc23e279f84" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.583923 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rsrfq" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.699542 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9"] Dec 03 19:26:11 crc kubenswrapper[4731]: E1203 19:26:11.700115 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec83323-6f0b-4824-aff6-c68a5b5628cd" containerName="ssh-known-hosts-edpm-deployment" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.700140 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec83323-6f0b-4824-aff6-c68a5b5628cd" containerName="ssh-known-hosts-edpm-deployment" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.700475 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ec83323-6f0b-4824-aff6-c68a5b5628cd" containerName="ssh-known-hosts-edpm-deployment" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.701364 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.704232 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.704234 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.704864 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.704868 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.724339 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9"] Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.872091 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnkxk\" (UniqueName: \"kubernetes.io/projected/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-kube-api-access-jnkxk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.872213 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.872504 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.976164 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.976247 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnkxk\" (UniqueName: \"kubernetes.io/projected/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-kube-api-access-jnkxk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.976313 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:11 crc kubenswrapper[4731]: I1203 19:26:11.981701 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:12 crc kubenswrapper[4731]: I1203 19:26:12.003599 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:12 crc kubenswrapper[4731]: I1203 19:26:12.005990 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnkxk\" (UniqueName: \"kubernetes.io/projected/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-kube-api-access-jnkxk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlcd9\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:12 crc kubenswrapper[4731]: I1203 19:26:12.021787 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:12 crc kubenswrapper[4731]: I1203 19:26:12.549749 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9"] Dec 03 19:26:12 crc kubenswrapper[4731]: I1203 19:26:12.596729 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" event={"ID":"65e0409b-7c26-42bc-9543-f82d0d6a1d5d","Type":"ContainerStarted","Data":"ffd0fc167645a04b580a2008a7ef94c667930c7b79dc9c8617bae9035f5c6279"} Dec 03 19:26:13 crc kubenswrapper[4731]: I1203 19:26:13.612740 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" event={"ID":"65e0409b-7c26-42bc-9543-f82d0d6a1d5d","Type":"ContainerStarted","Data":"0d73e549ea5ee28c435b7d5a173e5ce5d10c59ebb5c53be888933db7a9cb8455"} Dec 03 19:26:13 crc kubenswrapper[4731]: I1203 19:26:13.648046 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" podStartSLOduration=2.1308584919999998 podStartE2EDuration="2.648019273s" podCreationTimestamp="2025-12-03 19:26:11 +0000 UTC" firstStartedPulling="2025-12-03 19:26:12.554011865 +0000 UTC m=+1893.152606329" lastFinishedPulling="2025-12-03 19:26:13.071172646 +0000 UTC m=+1893.669767110" observedRunningTime="2025-12-03 19:26:13.63473056 +0000 UTC m=+1894.233325024" watchObservedRunningTime="2025-12-03 19:26:13.648019273 +0000 UTC m=+1894.246613737" Dec 03 19:26:20 crc kubenswrapper[4731]: I1203 19:26:20.132710 4731 scope.go:117] "RemoveContainer" containerID="e61e1ffa8e0f12f675542c84b1c21b31711ca6f6ca3d838a6c44803978f355f8" Dec 03 19:26:21 crc kubenswrapper[4731]: I1203 19:26:21.700908 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" 
event={"ID":"65e0409b-7c26-42bc-9543-f82d0d6a1d5d","Type":"ContainerDied","Data":"0d73e549ea5ee28c435b7d5a173e5ce5d10c59ebb5c53be888933db7a9cb8455"} Dec 03 19:26:21 crc kubenswrapper[4731]: I1203 19:26:21.700857 4731 generic.go:334] "Generic (PLEG): container finished" podID="65e0409b-7c26-42bc-9543-f82d0d6a1d5d" containerID="0d73e549ea5ee28c435b7d5a173e5ce5d10c59ebb5c53be888933db7a9cb8455" exitCode=0 Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.139292 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.322545 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnkxk\" (UniqueName: \"kubernetes.io/projected/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-kube-api-access-jnkxk\") pod \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.322613 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-ssh-key\") pod \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.322721 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-inventory\") pod \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\" (UID: \"65e0409b-7c26-42bc-9543-f82d0d6a1d5d\") " Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.331381 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-kube-api-access-jnkxk" (OuterVolumeSpecName: "kube-api-access-jnkxk") pod "65e0409b-7c26-42bc-9543-f82d0d6a1d5d" (UID: 
"65e0409b-7c26-42bc-9543-f82d0d6a1d5d"). InnerVolumeSpecName "kube-api-access-jnkxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.359710 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65e0409b-7c26-42bc-9543-f82d0d6a1d5d" (UID: "65e0409b-7c26-42bc-9543-f82d0d6a1d5d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.361606 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-inventory" (OuterVolumeSpecName: "inventory") pod "65e0409b-7c26-42bc-9543-f82d0d6a1d5d" (UID: "65e0409b-7c26-42bc-9543-f82d0d6a1d5d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.425623 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnkxk\" (UniqueName: \"kubernetes.io/projected/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-kube-api-access-jnkxk\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.425672 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.425684 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65e0409b-7c26-42bc-9543-f82d0d6a1d5d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.720693 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" 
event={"ID":"65e0409b-7c26-42bc-9543-f82d0d6a1d5d","Type":"ContainerDied","Data":"ffd0fc167645a04b580a2008a7ef94c667930c7b79dc9c8617bae9035f5c6279"} Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.721177 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffd0fc167645a04b580a2008a7ef94c667930c7b79dc9c8617bae9035f5c6279" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.720812 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlcd9" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.809885 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6"] Dec 03 19:26:23 crc kubenswrapper[4731]: E1203 19:26:23.810503 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e0409b-7c26-42bc-9543-f82d0d6a1d5d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.810530 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e0409b-7c26-42bc-9543-f82d0d6a1d5d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.810734 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e0409b-7c26-42bc-9543-f82d0d6a1d5d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.811595 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.815806 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.815930 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.816069 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.816158 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.828043 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6"] Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.833413 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.833732 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82z4r\" (UniqueName: \"kubernetes.io/projected/3c5a2880-9f98-4d8c-ac95-a9b784692c44-kube-api-access-82z4r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.833834 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.935479 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82z4r\" (UniqueName: \"kubernetes.io/projected/3c5a2880-9f98-4d8c-ac95-a9b784692c44-kube-api-access-82z4r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.935999 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.936106 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.940969 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.944841 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:23 crc kubenswrapper[4731]: I1203 19:26:23.962001 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82z4r\" (UniqueName: \"kubernetes.io/projected/3c5a2880-9f98-4d8c-ac95-a9b784692c44-kube-api-access-82z4r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:24 crc kubenswrapper[4731]: I1203 19:26:24.132710 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:24 crc kubenswrapper[4731]: I1203 19:26:24.700368 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6"] Dec 03 19:26:24 crc kubenswrapper[4731]: I1203 19:26:24.731538 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" event={"ID":"3c5a2880-9f98-4d8c-ac95-a9b784692c44","Type":"ContainerStarted","Data":"0eb7e79246e6f864da0a57c0a1a41dbc54239880034e862e6e3531cd76a3d820"} Dec 03 19:26:25 crc kubenswrapper[4731]: I1203 19:26:25.742036 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" event={"ID":"3c5a2880-9f98-4d8c-ac95-a9b784692c44","Type":"ContainerStarted","Data":"9dc8407e22d5f960691cfb752a52e375e1dbb54421997be934fcfae3c2b0657d"} Dec 03 19:26:25 crc kubenswrapper[4731]: I1203 19:26:25.759874 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" podStartSLOduration=2.306090606 podStartE2EDuration="2.759850874s" podCreationTimestamp="2025-12-03 19:26:23 +0000 UTC" firstStartedPulling="2025-12-03 19:26:24.701560578 +0000 UTC m=+1905.300155042" lastFinishedPulling="2025-12-03 19:26:25.155320846 +0000 UTC m=+1905.753915310" observedRunningTime="2025-12-03 19:26:25.755933153 +0000 UTC m=+1906.354527617" watchObservedRunningTime="2025-12-03 19:26:25.759850874 +0000 UTC m=+1906.358445338" Dec 03 19:26:35 crc kubenswrapper[4731]: I1203 19:26:35.854510 4731 generic.go:334] "Generic (PLEG): container finished" podID="3c5a2880-9f98-4d8c-ac95-a9b784692c44" containerID="9dc8407e22d5f960691cfb752a52e375e1dbb54421997be934fcfae3c2b0657d" exitCode=0 Dec 03 19:26:35 crc kubenswrapper[4731]: I1203 19:26:35.855593 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" event={"ID":"3c5a2880-9f98-4d8c-ac95-a9b784692c44","Type":"ContainerDied","Data":"9dc8407e22d5f960691cfb752a52e375e1dbb54421997be934fcfae3c2b0657d"} Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.373974 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.431829 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-inventory\") pod \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.432498 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82z4r\" (UniqueName: \"kubernetes.io/projected/3c5a2880-9f98-4d8c-ac95-a9b784692c44-kube-api-access-82z4r\") pod \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.432535 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-ssh-key\") pod \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\" (UID: \"3c5a2880-9f98-4d8c-ac95-a9b784692c44\") " Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.448754 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5a2880-9f98-4d8c-ac95-a9b784692c44-kube-api-access-82z4r" (OuterVolumeSpecName: "kube-api-access-82z4r") pod "3c5a2880-9f98-4d8c-ac95-a9b784692c44" (UID: "3c5a2880-9f98-4d8c-ac95-a9b784692c44"). InnerVolumeSpecName "kube-api-access-82z4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.474484 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3c5a2880-9f98-4d8c-ac95-a9b784692c44" (UID: "3c5a2880-9f98-4d8c-ac95-a9b784692c44"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.476745 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-inventory" (OuterVolumeSpecName: "inventory") pod "3c5a2880-9f98-4d8c-ac95-a9b784692c44" (UID: "3c5a2880-9f98-4d8c-ac95-a9b784692c44"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.535349 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82z4r\" (UniqueName: \"kubernetes.io/projected/3c5a2880-9f98-4d8c-ac95-a9b784692c44-kube-api-access-82z4r\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.535464 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.535487 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a2880-9f98-4d8c-ac95-a9b784692c44-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.877595 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" 
event={"ID":"3c5a2880-9f98-4d8c-ac95-a9b784692c44","Type":"ContainerDied","Data":"0eb7e79246e6f864da0a57c0a1a41dbc54239880034e862e6e3531cd76a3d820"} Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.877675 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eb7e79246e6f864da0a57c0a1a41dbc54239880034e862e6e3531cd76a3d820" Dec 03 19:26:37 crc kubenswrapper[4731]: I1203 19:26:37.878075 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.004657 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6"] Dec 03 19:26:38 crc kubenswrapper[4731]: E1203 19:26:38.005380 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a2880-9f98-4d8c-ac95-a9b784692c44" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.005402 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a2880-9f98-4d8c-ac95-a9b784692c44" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.005602 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5a2880-9f98-4d8c-ac95-a9b784692c44" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.006372 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.015317 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.015688 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.016164 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.016483 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.016665 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.016779 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.016932 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.017177 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.026794 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6"] Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.150603 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.150666 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.150799 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.150848 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.150924 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xnc2\" (UniqueName: 
\"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-kube-api-access-9xnc2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.150975 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.151035 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.151123 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.151190 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.151299 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.151348 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.151461 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.151509 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.151572 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.255879 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.255961 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256065 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256095 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256135 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256176 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256217 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xnc2\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-kube-api-access-9xnc2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256242 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256301 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256329 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256363 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 
03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256437 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256478 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.256526 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.264273 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.265505 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.266293 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.267117 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.267314 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.268175 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.268341 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.268419 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.268891 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.269510 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: 
\"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.269669 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.270117 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.276406 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.277053 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xnc2\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-kube-api-access-9xnc2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc 
kubenswrapper[4731]: I1203 19:26:38.370953 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:26:38 crc kubenswrapper[4731]: I1203 19:26:38.927749 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6"] Dec 03 19:26:39 crc kubenswrapper[4731]: I1203 19:26:39.903160 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" event={"ID":"500db9f6-205e-4e09-a8a4-f0e1bf42e867","Type":"ContainerStarted","Data":"4bb38612dc6b95c559923ba3519b041d0e1530bb5488d6c03ad2b716fdb9a479"} Dec 03 19:26:39 crc kubenswrapper[4731]: I1203 19:26:39.903501 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" event={"ID":"500db9f6-205e-4e09-a8a4-f0e1bf42e867","Type":"ContainerStarted","Data":"97c2e96fb5726f93e7ff10ca03a64397a33ea4fc52f595ad4715a7228788af44"} Dec 03 19:26:39 crc kubenswrapper[4731]: I1203 19:26:39.934549 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" podStartSLOduration=2.26074039 podStartE2EDuration="2.934521511s" podCreationTimestamp="2025-12-03 19:26:37 +0000 UTC" firstStartedPulling="2025-12-03 19:26:38.937064705 +0000 UTC m=+1919.535659169" lastFinishedPulling="2025-12-03 19:26:39.610845826 +0000 UTC m=+1920.209440290" observedRunningTime="2025-12-03 19:26:39.921648571 +0000 UTC m=+1920.520243035" watchObservedRunningTime="2025-12-03 19:26:39.934521511 +0000 UTC m=+1920.533115975" Dec 03 19:26:56 crc kubenswrapper[4731]: I1203 19:26:56.468567 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:26:56 crc kubenswrapper[4731]: I1203 19:26:56.469086 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:27:20 crc kubenswrapper[4731]: E1203 19:27:20.939221 4731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod500db9f6_205e_4e09_a8a4_f0e1bf42e867.slice/crio-conmon-4bb38612dc6b95c559923ba3519b041d0e1530bb5488d6c03ad2b716fdb9a479.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod500db9f6_205e_4e09_a8a4_f0e1bf42e867.slice/crio-4bb38612dc6b95c559923ba3519b041d0e1530bb5488d6c03ad2b716fdb9a479.scope\": RecentStats: unable to find data in memory cache]" Dec 03 19:27:21 crc kubenswrapper[4731]: I1203 19:27:21.357221 4731 generic.go:334] "Generic (PLEG): container finished" podID="500db9f6-205e-4e09-a8a4-f0e1bf42e867" containerID="4bb38612dc6b95c559923ba3519b041d0e1530bb5488d6c03ad2b716fdb9a479" exitCode=0 Dec 03 19:27:21 crc kubenswrapper[4731]: I1203 19:27:21.357299 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" event={"ID":"500db9f6-205e-4e09-a8a4-f0e1bf42e867","Type":"ContainerDied","Data":"4bb38612dc6b95c559923ba3519b041d0e1530bb5488d6c03ad2b716fdb9a479"} Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.754671 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840419 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ssh-key\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840480 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-repo-setup-combined-ca-bundle\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840508 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840543 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-ovn-default-certs-0\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840561 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ovn-combined-ca-bundle\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " 
Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840614 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-nova-combined-ca-bundle\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840636 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xnc2\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-kube-api-access-9xnc2\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840677 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-neutron-metadata-combined-ca-bundle\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840704 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-inventory\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840742 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-telemetry-combined-ca-bundle\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840764 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-libvirt-combined-ca-bundle\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840794 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840846 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-bootstrap-combined-ca-bundle\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.840945 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\" (UID: \"500db9f6-205e-4e09-a8a4-f0e1bf42e867\") " Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.850035 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.850722 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.850937 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.851648 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-kube-api-access-9xnc2" (OuterVolumeSpecName: "kube-api-access-9xnc2") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "kube-api-access-9xnc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.851754 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.851813 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.851977 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.853373 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.854490 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.855077 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.856083 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.856192 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.877287 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.880380 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-inventory" (OuterVolumeSpecName: "inventory") pod "500db9f6-205e-4e09-a8a4-f0e1bf42e867" (UID: "500db9f6-205e-4e09-a8a4-f0e1bf42e867"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944507 4731 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944547 4731 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944560 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944575 4731 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944586 4731 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc 
kubenswrapper[4731]: I1203 19:27:22.944611 4731 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944621 4731 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944631 4731 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944641 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xnc2\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-kube-api-access-9xnc2\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944652 4731 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944668 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944677 4731 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944686 4731 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500db9f6-205e-4e09-a8a4-f0e1bf42e867-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:22 crc kubenswrapper[4731]: I1203 19:27:22.944697 4731 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/500db9f6-205e-4e09-a8a4-f0e1bf42e867-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.379338 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" event={"ID":"500db9f6-205e-4e09-a8a4-f0e1bf42e867","Type":"ContainerDied","Data":"97c2e96fb5726f93e7ff10ca03a64397a33ea4fc52f595ad4715a7228788af44"} Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.379453 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c2e96fb5726f93e7ff10ca03a64397a33ea4fc52f595ad4715a7228788af44" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.379446 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.540050 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx"] Dec 03 19:27:23 crc kubenswrapper[4731]: E1203 19:27:23.540630 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500db9f6-205e-4e09-a8a4-f0e1bf42e867" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.540657 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="500db9f6-205e-4e09-a8a4-f0e1bf42e867" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.540859 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="500db9f6-205e-4e09-a8a4-f0e1bf42e867" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.541764 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.544316 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.544554 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.544912 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.544920 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.546049 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.551961 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx"] Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.657514 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.657587 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/732d4254-41b9-4098-87cd-223787bf455e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.657731 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ldk4\" (UniqueName: \"kubernetes.io/projected/732d4254-41b9-4098-87cd-223787bf455e-kube-api-access-8ldk4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.657772 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.657841 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.759486 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.759580 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.759609 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/732d4254-41b9-4098-87cd-223787bf455e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.759888 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ldk4\" (UniqueName: \"kubernetes.io/projected/732d4254-41b9-4098-87cd-223787bf455e-kube-api-access-8ldk4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.761366 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/732d4254-41b9-4098-87cd-223787bf455e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.761517 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc 
kubenswrapper[4731]: I1203 19:27:23.765892 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.767942 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.768232 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.788751 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ldk4\" (UniqueName: \"kubernetes.io/projected/732d4254-41b9-4098-87cd-223787bf455e-kube-api-access-8ldk4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6thtx\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:23 crc kubenswrapper[4731]: I1203 19:27:23.867635 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:27:24 crc kubenswrapper[4731]: I1203 19:27:24.418771 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx"] Dec 03 19:27:25 crc kubenswrapper[4731]: I1203 19:27:25.402316 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" event={"ID":"732d4254-41b9-4098-87cd-223787bf455e","Type":"ContainerStarted","Data":"c849e37b5aaad5b28ffe2ae7636d956ae058a770b1857247cbc8d9f957c5830c"} Dec 03 19:27:25 crc kubenswrapper[4731]: I1203 19:27:25.402770 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" event={"ID":"732d4254-41b9-4098-87cd-223787bf455e","Type":"ContainerStarted","Data":"ecfc6418722098cbec0776eef049ac9656bc2cf3f24ae6c5db2eb57b358a6c22"} Dec 03 19:27:25 crc kubenswrapper[4731]: I1203 19:27:25.429345 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" podStartSLOduration=2.040224259 podStartE2EDuration="2.429319428s" podCreationTimestamp="2025-12-03 19:27:23 +0000 UTC" firstStartedPulling="2025-12-03 19:27:24.421962924 +0000 UTC m=+1965.020557378" lastFinishedPulling="2025-12-03 19:27:24.811058083 +0000 UTC m=+1965.409652547" observedRunningTime="2025-12-03 19:27:25.422471404 +0000 UTC m=+1966.021065888" watchObservedRunningTime="2025-12-03 19:27:25.429319428 +0000 UTC m=+1966.027913892" Dec 03 19:27:26 crc kubenswrapper[4731]: I1203 19:27:26.468829 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:27:26 crc kubenswrapper[4731]: I1203 19:27:26.468932 4731 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:27:56 crc kubenswrapper[4731]: I1203 19:27:56.468591 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:27:56 crc kubenswrapper[4731]: I1203 19:27:56.469225 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:27:56 crc kubenswrapper[4731]: I1203 19:27:56.469386 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:27:56 crc kubenswrapper[4731]: I1203 19:27:56.470434 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7cf6f60cbac1430c262c450bc8ec3373da0e50f736eaecd23c4a7f1ca121d110"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:27:56 crc kubenswrapper[4731]: I1203 19:27:56.470507 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" 
containerName="machine-config-daemon" containerID="cri-o://7cf6f60cbac1430c262c450bc8ec3373da0e50f736eaecd23c4a7f1ca121d110" gracePeriod=600 Dec 03 19:27:56 crc kubenswrapper[4731]: I1203 19:27:56.765222 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="7cf6f60cbac1430c262c450bc8ec3373da0e50f736eaecd23c4a7f1ca121d110" exitCode=0 Dec 03 19:27:56 crc kubenswrapper[4731]: I1203 19:27:56.765606 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"7cf6f60cbac1430c262c450bc8ec3373da0e50f736eaecd23c4a7f1ca121d110"} Dec 03 19:27:56 crc kubenswrapper[4731]: I1203 19:27:56.765711 4731 scope.go:117] "RemoveContainer" containerID="084f93a5611050f53560c1ea1840a4143a105fbc6fdce1a2f17d2337d2acb808" Dec 03 19:27:57 crc kubenswrapper[4731]: I1203 19:27:57.780211 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331"} Dec 03 19:28:32 crc kubenswrapper[4731]: I1203 19:28:32.152818 4731 generic.go:334] "Generic (PLEG): container finished" podID="732d4254-41b9-4098-87cd-223787bf455e" containerID="c849e37b5aaad5b28ffe2ae7636d956ae058a770b1857247cbc8d9f957c5830c" exitCode=0 Dec 03 19:28:32 crc kubenswrapper[4731]: I1203 19:28:32.153426 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" event={"ID":"732d4254-41b9-4098-87cd-223787bf455e","Type":"ContainerDied","Data":"c849e37b5aaad5b28ffe2ae7636d956ae058a770b1857247cbc8d9f957c5830c"} Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.603720 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.649314 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ldk4\" (UniqueName: \"kubernetes.io/projected/732d4254-41b9-4098-87cd-223787bf455e-kube-api-access-8ldk4\") pod \"732d4254-41b9-4098-87cd-223787bf455e\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.649458 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/732d4254-41b9-4098-87cd-223787bf455e-ovncontroller-config-0\") pod \"732d4254-41b9-4098-87cd-223787bf455e\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.649505 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ovn-combined-ca-bundle\") pod \"732d4254-41b9-4098-87cd-223787bf455e\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.649596 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ssh-key\") pod \"732d4254-41b9-4098-87cd-223787bf455e\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.649705 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-inventory\") pod \"732d4254-41b9-4098-87cd-223787bf455e\" (UID: \"732d4254-41b9-4098-87cd-223787bf455e\") " Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.656919 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "732d4254-41b9-4098-87cd-223787bf455e" (UID: "732d4254-41b9-4098-87cd-223787bf455e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.657123 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732d4254-41b9-4098-87cd-223787bf455e-kube-api-access-8ldk4" (OuterVolumeSpecName: "kube-api-access-8ldk4") pod "732d4254-41b9-4098-87cd-223787bf455e" (UID: "732d4254-41b9-4098-87cd-223787bf455e"). InnerVolumeSpecName "kube-api-access-8ldk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.677416 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732d4254-41b9-4098-87cd-223787bf455e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "732d4254-41b9-4098-87cd-223787bf455e" (UID: "732d4254-41b9-4098-87cd-223787bf455e"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.682146 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-inventory" (OuterVolumeSpecName: "inventory") pod "732d4254-41b9-4098-87cd-223787bf455e" (UID: "732d4254-41b9-4098-87cd-223787bf455e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.682522 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "732d4254-41b9-4098-87cd-223787bf455e" (UID: "732d4254-41b9-4098-87cd-223787bf455e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.753185 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.753328 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ldk4\" (UniqueName: \"kubernetes.io/projected/732d4254-41b9-4098-87cd-223787bf455e-kube-api-access-8ldk4\") on node \"crc\" DevicePath \"\"" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.753354 4731 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/732d4254-41b9-4098-87cd-223787bf455e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.753376 4731 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:28:33 crc kubenswrapper[4731]: I1203 19:28:33.753390 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/732d4254-41b9-4098-87cd-223787bf455e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.179586 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" event={"ID":"732d4254-41b9-4098-87cd-223787bf455e","Type":"ContainerDied","Data":"ecfc6418722098cbec0776eef049ac9656bc2cf3f24ae6c5db2eb57b358a6c22"} Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.179637 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecfc6418722098cbec0776eef049ac9656bc2cf3f24ae6c5db2eb57b358a6c22" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.179840 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6thtx" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.289419 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876"] Dec 03 19:28:34 crc kubenswrapper[4731]: E1203 19:28:34.289904 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732d4254-41b9-4098-87cd-223787bf455e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.289924 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="732d4254-41b9-4098-87cd-223787bf455e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.290126 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="732d4254-41b9-4098-87cd-223787bf455e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.290823 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.300858 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.301093 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.301240 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.301370 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.301460 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.301635 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.305728 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876"] Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.366848 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.367314 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmdf\" (UniqueName: \"kubernetes.io/projected/62af9712-5ba8-42f1-ba62-6b6de75e0de6-kube-api-access-mvmdf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.367357 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.367418 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.367441 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.367460 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.469770 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.469835 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.469875 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.470031 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.470098 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmdf\" (UniqueName: \"kubernetes.io/projected/62af9712-5ba8-42f1-ba62-6b6de75e0de6-kube-api-access-mvmdf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.470185 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.475604 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.476411 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: 
\"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.478570 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.479533 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.479563 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc kubenswrapper[4731]: I1203 19:28:34.488285 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmdf\" (UniqueName: \"kubernetes.io/projected/62af9712-5ba8-42f1-ba62-6b6de75e0de6-kube-api-access-mvmdf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:34 crc 
kubenswrapper[4731]: I1203 19:28:34.655240 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:28:35 crc kubenswrapper[4731]: I1203 19:28:35.193858 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876"] Dec 03 19:28:36 crc kubenswrapper[4731]: I1203 19:28:36.203670 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" event={"ID":"62af9712-5ba8-42f1-ba62-6b6de75e0de6","Type":"ContainerStarted","Data":"c341377966a45f5e2ab69461a21030fcde0fe9f82a4ec5816cb6c755c9342709"} Dec 03 19:28:37 crc kubenswrapper[4731]: I1203 19:28:37.215325 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" event={"ID":"62af9712-5ba8-42f1-ba62-6b6de75e0de6","Type":"ContainerStarted","Data":"68bdd35f8dac06a168a988fdf63b92aecb382f409a7db736a6e5474be9074fdd"} Dec 03 19:28:37 crc kubenswrapper[4731]: I1203 19:28:37.240371 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" podStartSLOduration=2.46646097 podStartE2EDuration="3.240340983s" podCreationTimestamp="2025-12-03 19:28:34 +0000 UTC" firstStartedPulling="2025-12-03 19:28:35.202402983 +0000 UTC m=+2035.800997447" lastFinishedPulling="2025-12-03 19:28:35.976282996 +0000 UTC m=+2036.574877460" observedRunningTime="2025-12-03 19:28:37.239074653 +0000 UTC m=+2037.837669137" watchObservedRunningTime="2025-12-03 19:28:37.240340983 +0000 UTC m=+2037.838935457" Dec 03 19:29:29 crc kubenswrapper[4731]: I1203 19:29:29.778395 4731 generic.go:334] "Generic (PLEG): container finished" podID="62af9712-5ba8-42f1-ba62-6b6de75e0de6" containerID="68bdd35f8dac06a168a988fdf63b92aecb382f409a7db736a6e5474be9074fdd" exitCode=0 Dec 03 
19:29:29 crc kubenswrapper[4731]: I1203 19:29:29.778470 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" event={"ID":"62af9712-5ba8-42f1-ba62-6b6de75e0de6","Type":"ContainerDied","Data":"68bdd35f8dac06a168a988fdf63b92aecb382f409a7db736a6e5474be9074fdd"} Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.241568 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.316406 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-inventory\") pod \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.316463 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-ssh-key\") pod \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.316720 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.316786 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-metadata-combined-ca-bundle\") pod \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\" (UID: 
\"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.316866 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-nova-metadata-neutron-config-0\") pod \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.316898 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvmdf\" (UniqueName: \"kubernetes.io/projected/62af9712-5ba8-42f1-ba62-6b6de75e0de6-kube-api-access-mvmdf\") pod \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\" (UID: \"62af9712-5ba8-42f1-ba62-6b6de75e0de6\") " Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.324206 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "62af9712-5ba8-42f1-ba62-6b6de75e0de6" (UID: "62af9712-5ba8-42f1-ba62-6b6de75e0de6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.324527 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62af9712-5ba8-42f1-ba62-6b6de75e0de6-kube-api-access-mvmdf" (OuterVolumeSpecName: "kube-api-access-mvmdf") pod "62af9712-5ba8-42f1-ba62-6b6de75e0de6" (UID: "62af9712-5ba8-42f1-ba62-6b6de75e0de6"). InnerVolumeSpecName "kube-api-access-mvmdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.348243 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62af9712-5ba8-42f1-ba62-6b6de75e0de6" (UID: "62af9712-5ba8-42f1-ba62-6b6de75e0de6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.350243 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "62af9712-5ba8-42f1-ba62-6b6de75e0de6" (UID: "62af9712-5ba8-42f1-ba62-6b6de75e0de6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.366575 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-inventory" (OuterVolumeSpecName: "inventory") pod "62af9712-5ba8-42f1-ba62-6b6de75e0de6" (UID: "62af9712-5ba8-42f1-ba62-6b6de75e0de6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.369098 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "62af9712-5ba8-42f1-ba62-6b6de75e0de6" (UID: "62af9712-5ba8-42f1-ba62-6b6de75e0de6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.420901 4731 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.421219 4731 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.421233 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvmdf\" (UniqueName: \"kubernetes.io/projected/62af9712-5ba8-42f1-ba62-6b6de75e0de6-kube-api-access-mvmdf\") on node \"crc\" DevicePath \"\"" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.421245 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.421269 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.421278 4731 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62af9712-5ba8-42f1-ba62-6b6de75e0de6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.803678 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" 
event={"ID":"62af9712-5ba8-42f1-ba62-6b6de75e0de6","Type":"ContainerDied","Data":"c341377966a45f5e2ab69461a21030fcde0fe9f82a4ec5816cb6c755c9342709"} Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.803747 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c341377966a45f5e2ab69461a21030fcde0fe9f82a4ec5816cb6c755c9342709" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.803839 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.947247 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5"] Dec 03 19:29:31 crc kubenswrapper[4731]: E1203 19:29:31.947761 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62af9712-5ba8-42f1-ba62-6b6de75e0de6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.947780 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="62af9712-5ba8-42f1-ba62-6b6de75e0de6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.947972 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="62af9712-5ba8-42f1-ba62-6b6de75e0de6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.948704 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.953896 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.953994 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.953904 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.953994 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.954286 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:29:31 crc kubenswrapper[4731]: I1203 19:29:31.982324 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5"] Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.033845 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.034051 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.034447 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8dwn\" (UniqueName: \"kubernetes.io/projected/63e88ef4-d82b-4798-b386-8158184b32d4-kube-api-access-b8dwn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.034508 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.034844 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.137200 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.137389 4731 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-b8dwn\" (UniqueName: \"kubernetes.io/projected/63e88ef4-d82b-4798-b386-8158184b32d4-kube-api-access-b8dwn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.137423 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.137521 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.137593 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.142220 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.142382 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.142920 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.146195 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.157532 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8dwn\" (UniqueName: \"kubernetes.io/projected/63e88ef4-d82b-4798-b386-8158184b32d4-kube-api-access-b8dwn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.273856 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:29:32 crc kubenswrapper[4731]: I1203 19:29:32.819467 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5"] Dec 03 19:29:33 crc kubenswrapper[4731]: I1203 19:29:33.826915 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" event={"ID":"63e88ef4-d82b-4798-b386-8158184b32d4","Type":"ContainerStarted","Data":"b85596f09024ac63616b5bf05afd4f88a0b0ef19966b64f52c8b04c57dfe2811"} Dec 03 19:29:33 crc kubenswrapper[4731]: I1203 19:29:33.827401 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" event={"ID":"63e88ef4-d82b-4798-b386-8158184b32d4","Type":"ContainerStarted","Data":"7b746116e0239c678a24f6f799c7c33cc97c1b8307d4ba1befee3c706a91b6ac"} Dec 03 19:29:33 crc kubenswrapper[4731]: I1203 19:29:33.854677 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" podStartSLOduration=2.335056829 podStartE2EDuration="2.854641565s" podCreationTimestamp="2025-12-03 19:29:31 +0000 UTC" firstStartedPulling="2025-12-03 19:29:32.831423678 +0000 UTC m=+2093.430018142" lastFinishedPulling="2025-12-03 19:29:33.351008404 +0000 UTC m=+2093.949602878" observedRunningTime="2025-12-03 19:29:33.843496568 +0000 UTC m=+2094.442091052" watchObservedRunningTime="2025-12-03 19:29:33.854641565 +0000 UTC m=+2094.453236069" Dec 03 19:29:56 crc kubenswrapper[4731]: I1203 19:29:56.468439 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:29:56 crc kubenswrapper[4731]: I1203 
19:29:56.468989 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.142143 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv"] Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.144738 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.147436 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.147540 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.160867 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv"] Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.282276 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqwgb\" (UniqueName: \"kubernetes.io/projected/2fce13ad-818f-4fae-906c-ce7ff951c64f-kube-api-access-rqwgb\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.282327 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2fce13ad-818f-4fae-906c-ce7ff951c64f-secret-volume\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.282379 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fce13ad-818f-4fae-906c-ce7ff951c64f-config-volume\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.384623 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fce13ad-818f-4fae-906c-ce7ff951c64f-config-volume\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.384813 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqwgb\" (UniqueName: \"kubernetes.io/projected/2fce13ad-818f-4fae-906c-ce7ff951c64f-kube-api-access-rqwgb\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.384842 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fce13ad-818f-4fae-906c-ce7ff951c64f-secret-volume\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: 
I1203 19:30:00.385545 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fce13ad-818f-4fae-906c-ce7ff951c64f-config-volume\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.394040 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fce13ad-818f-4fae-906c-ce7ff951c64f-secret-volume\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.413007 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqwgb\" (UniqueName: \"kubernetes.io/projected/2fce13ad-818f-4fae-906c-ce7ff951c64f-kube-api-access-rqwgb\") pod \"collect-profiles-29413170-8jqlv\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.499007 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:00 crc kubenswrapper[4731]: I1203 19:30:00.972152 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv"] Dec 03 19:30:01 crc kubenswrapper[4731]: I1203 19:30:01.103101 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" event={"ID":"2fce13ad-818f-4fae-906c-ce7ff951c64f","Type":"ContainerStarted","Data":"37380f65c49814545b1e3a283d6b3f6a696268530ac80fde7d7bf3e4d2b91925"} Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.114129 4731 generic.go:334] "Generic (PLEG): container finished" podID="2fce13ad-818f-4fae-906c-ce7ff951c64f" containerID="e866d16d0b861317e9c1db95b535577f94b5a79679ae3ee7491bf09a5c02c6ab" exitCode=0 Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.114307 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" event={"ID":"2fce13ad-818f-4fae-906c-ce7ff951c64f","Type":"ContainerDied","Data":"e866d16d0b861317e9c1db95b535577f94b5a79679ae3ee7491bf09a5c02c6ab"} Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.797055 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-92zkc"] Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.802578 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.812879 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-92zkc"] Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.833848 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n8pl\" (UniqueName: \"kubernetes.io/projected/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-kube-api-access-8n8pl\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.834021 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-utilities\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.834052 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-catalog-content\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.936153 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n8pl\" (UniqueName: \"kubernetes.io/projected/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-kube-api-access-8n8pl\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.936327 4731 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-utilities\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.936358 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-catalog-content\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.936933 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-catalog-content\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.937101 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-utilities\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:02 crc kubenswrapper[4731]: I1203 19:30:02.957796 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n8pl\" (UniqueName: \"kubernetes.io/projected/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-kube-api-access-8n8pl\") pod \"certified-operators-92zkc\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.131355 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.637210 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.751127 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fce13ad-818f-4fae-906c-ce7ff951c64f-config-volume\") pod \"2fce13ad-818f-4fae-906c-ce7ff951c64f\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.751195 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqwgb\" (UniqueName: \"kubernetes.io/projected/2fce13ad-818f-4fae-906c-ce7ff951c64f-kube-api-access-rqwgb\") pod \"2fce13ad-818f-4fae-906c-ce7ff951c64f\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.751522 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fce13ad-818f-4fae-906c-ce7ff951c64f-secret-volume\") pod \"2fce13ad-818f-4fae-906c-ce7ff951c64f\" (UID: \"2fce13ad-818f-4fae-906c-ce7ff951c64f\") " Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.751841 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fce13ad-818f-4fae-906c-ce7ff951c64f-config-volume" (OuterVolumeSpecName: "config-volume") pod "2fce13ad-818f-4fae-906c-ce7ff951c64f" (UID: "2fce13ad-818f-4fae-906c-ce7ff951c64f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.752069 4731 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fce13ad-818f-4fae-906c-ce7ff951c64f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.761218 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fce13ad-818f-4fae-906c-ce7ff951c64f-kube-api-access-rqwgb" (OuterVolumeSpecName: "kube-api-access-rqwgb") pod "2fce13ad-818f-4fae-906c-ce7ff951c64f" (UID: "2fce13ad-818f-4fae-906c-ce7ff951c64f"). InnerVolumeSpecName "kube-api-access-rqwgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.763711 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fce13ad-818f-4fae-906c-ce7ff951c64f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2fce13ad-818f-4fae-906c-ce7ff951c64f" (UID: "2fce13ad-818f-4fae-906c-ce7ff951c64f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:30:03 crc kubenswrapper[4731]: W1203 19:30:03.795188 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb174f58e_e6a1_4c08_96bd_5bbcd6fec3f3.slice/crio-0c019fee2b126dcc20f0149277df943e4524e57e6f83b2bf1566f04dfbc14948 WatchSource:0}: Error finding container 0c019fee2b126dcc20f0149277df943e4524e57e6f83b2bf1566f04dfbc14948: Status 404 returned error can't find the container with id 0c019fee2b126dcc20f0149277df943e4524e57e6f83b2bf1566f04dfbc14948 Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.830795 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-92zkc"] Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.855105 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqwgb\" (UniqueName: \"kubernetes.io/projected/2fce13ad-818f-4fae-906c-ce7ff951c64f-kube-api-access-rqwgb\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:03 crc kubenswrapper[4731]: I1203 19:30:03.855146 4731 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fce13ad-818f-4fae-906c-ce7ff951c64f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:04 crc kubenswrapper[4731]: I1203 19:30:04.135552 4731 generic.go:334] "Generic (PLEG): container finished" podID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerID="e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c" exitCode=0 Dec 03 19:30:04 crc kubenswrapper[4731]: I1203 19:30:04.135673 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92zkc" event={"ID":"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3","Type":"ContainerDied","Data":"e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c"} Dec 03 19:30:04 crc kubenswrapper[4731]: I1203 19:30:04.135787 4731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-92zkc" event={"ID":"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3","Type":"ContainerStarted","Data":"0c019fee2b126dcc20f0149277df943e4524e57e6f83b2bf1566f04dfbc14948"} Dec 03 19:30:04 crc kubenswrapper[4731]: I1203 19:30:04.138845 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" event={"ID":"2fce13ad-818f-4fae-906c-ce7ff951c64f","Type":"ContainerDied","Data":"37380f65c49814545b1e3a283d6b3f6a696268530ac80fde7d7bf3e4d2b91925"} Dec 03 19:30:04 crc kubenswrapper[4731]: I1203 19:30:04.138883 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413170-8jqlv" Dec 03 19:30:04 crc kubenswrapper[4731]: I1203 19:30:04.138888 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37380f65c49814545b1e3a283d6b3f6a696268530ac80fde7d7bf3e4d2b91925" Dec 03 19:30:04 crc kubenswrapper[4731]: I1203 19:30:04.752763 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm"] Dec 03 19:30:04 crc kubenswrapper[4731]: I1203 19:30:04.762240 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413125-g6bsm"] Dec 03 19:30:05 crc kubenswrapper[4731]: I1203 19:30:05.150588 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92zkc" event={"ID":"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3","Type":"ContainerStarted","Data":"4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8"} Dec 03 19:30:05 crc kubenswrapper[4731]: I1203 19:30:05.870471 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="294f0919-18b7-42f8-8528-c3ada8d12d53" path="/var/lib/kubelet/pods/294f0919-18b7-42f8-8528-c3ada8d12d53/volumes" Dec 03 19:30:06 crc kubenswrapper[4731]: I1203 
19:30:06.163527 4731 generic.go:334] "Generic (PLEG): container finished" podID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerID="4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8" exitCode=0 Dec 03 19:30:06 crc kubenswrapper[4731]: I1203 19:30:06.163594 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92zkc" event={"ID":"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3","Type":"ContainerDied","Data":"4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8"} Dec 03 19:30:07 crc kubenswrapper[4731]: I1203 19:30:07.175693 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92zkc" event={"ID":"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3","Type":"ContainerStarted","Data":"c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e"} Dec 03 19:30:07 crc kubenswrapper[4731]: I1203 19:30:07.193480 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-92zkc" podStartSLOduration=2.649312305 podStartE2EDuration="5.193459296s" podCreationTimestamp="2025-12-03 19:30:02 +0000 UTC" firstStartedPulling="2025-12-03 19:30:04.137082747 +0000 UTC m=+2124.735677201" lastFinishedPulling="2025-12-03 19:30:06.681229728 +0000 UTC m=+2127.279824192" observedRunningTime="2025-12-03 19:30:07.192815876 +0000 UTC m=+2127.791410340" watchObservedRunningTime="2025-12-03 19:30:07.193459296 +0000 UTC m=+2127.792053760" Dec 03 19:30:13 crc kubenswrapper[4731]: I1203 19:30:13.132294 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:13 crc kubenswrapper[4731]: I1203 19:30:13.132824 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:13 crc kubenswrapper[4731]: I1203 19:30:13.189063 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:13 crc kubenswrapper[4731]: I1203 19:30:13.278846 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:13 crc kubenswrapper[4731]: I1203 19:30:13.429660 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-92zkc"] Dec 03 19:30:15 crc kubenswrapper[4731]: I1203 19:30:15.249776 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-92zkc" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerName="registry-server" containerID="cri-o://c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e" gracePeriod=2 Dec 03 19:30:15 crc kubenswrapper[4731]: E1203 19:30:15.430351 4731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb174f58e_e6a1_4c08_96bd_5bbcd6fec3f3.slice/crio-c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e.scope\": RecentStats: unable to find data in memory cache]" Dec 03 19:30:15 crc kubenswrapper[4731]: I1203 19:30:15.792871 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:15 crc kubenswrapper[4731]: I1203 19:30:15.919056 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-catalog-content\") pod \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " Dec 03 19:30:15 crc kubenswrapper[4731]: I1203 19:30:15.919117 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-utilities\") pod \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " Dec 03 19:30:15 crc kubenswrapper[4731]: I1203 19:30:15.919243 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n8pl\" (UniqueName: \"kubernetes.io/projected/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-kube-api-access-8n8pl\") pod \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\" (UID: \"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3\") " Dec 03 19:30:15 crc kubenswrapper[4731]: I1203 19:30:15.920446 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-utilities" (OuterVolumeSpecName: "utilities") pod "b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" (UID: "b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:30:15 crc kubenswrapper[4731]: I1203 19:30:15.927458 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-kube-api-access-8n8pl" (OuterVolumeSpecName: "kube-api-access-8n8pl") pod "b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" (UID: "b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3"). InnerVolumeSpecName "kube-api-access-8n8pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:30:15 crc kubenswrapper[4731]: I1203 19:30:15.976501 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" (UID: "b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.021763 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.021818 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.021830 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n8pl\" (UniqueName: \"kubernetes.io/projected/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3-kube-api-access-8n8pl\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.262599 4731 generic.go:334] "Generic (PLEG): container finished" podID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerID="c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e" exitCode=0 Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.262723 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-92zkc" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.262717 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92zkc" event={"ID":"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3","Type":"ContainerDied","Data":"c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e"} Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.263123 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92zkc" event={"ID":"b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3","Type":"ContainerDied","Data":"0c019fee2b126dcc20f0149277df943e4524e57e6f83b2bf1566f04dfbc14948"} Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.263154 4731 scope.go:117] "RemoveContainer" containerID="c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.308598 4731 scope.go:117] "RemoveContainer" containerID="4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.314917 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-92zkc"] Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.330157 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-92zkc"] Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.354120 4731 scope.go:117] "RemoveContainer" containerID="e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.391580 4731 scope.go:117] "RemoveContainer" containerID="c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e" Dec 03 19:30:16 crc kubenswrapper[4731]: E1203 19:30:16.392326 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e\": container with ID starting with c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e not found: ID does not exist" containerID="c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.392444 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e"} err="failed to get container status \"c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e\": rpc error: code = NotFound desc = could not find container \"c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e\": container with ID starting with c43361cdf3834e193607c14a24681c7d112aede970ae87e11828f6b8467b3a8e not found: ID does not exist" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.392540 4731 scope.go:117] "RemoveContainer" containerID="4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8" Dec 03 19:30:16 crc kubenswrapper[4731]: E1203 19:30:16.393094 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8\": container with ID starting with 4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8 not found: ID does not exist" containerID="4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.393184 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8"} err="failed to get container status \"4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8\": rpc error: code = NotFound desc = could not find container \"4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8\": container with ID 
starting with 4fd3b1174786ab7964042b0348b115d30a91599eff4164546b6fdd6c93056dc8 not found: ID does not exist" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.393265 4731 scope.go:117] "RemoveContainer" containerID="e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c" Dec 03 19:30:16 crc kubenswrapper[4731]: E1203 19:30:16.393605 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c\": container with ID starting with e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c not found: ID does not exist" containerID="e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c" Dec 03 19:30:16 crc kubenswrapper[4731]: I1203 19:30:16.393685 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c"} err="failed to get container status \"e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c\": rpc error: code = NotFound desc = could not find container \"e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c\": container with ID starting with e6d2b3475b2decbfa919a85dcd39b6261a8d86fe373e7337d63552a70d71320c not found: ID does not exist" Dec 03 19:30:17 crc kubenswrapper[4731]: I1203 19:30:17.866861 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" path="/var/lib/kubelet/pods/b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3/volumes" Dec 03 19:30:20 crc kubenswrapper[4731]: I1203 19:30:20.324714 4731 scope.go:117] "RemoveContainer" containerID="fabaadcaeacad1a67a5f48c25afe1cd2186bd13b04f50e08841e788b0948f72d" Dec 03 19:30:26 crc kubenswrapper[4731]: I1203 19:30:26.468972 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:30:26 crc kubenswrapper[4731]: I1203 19:30:26.471338 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.166043 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vcm7m"] Dec 03 19:30:27 crc kubenswrapper[4731]: E1203 19:30:27.166482 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fce13ad-818f-4fae-906c-ce7ff951c64f" containerName="collect-profiles" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.166496 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fce13ad-818f-4fae-906c-ce7ff951c64f" containerName="collect-profiles" Dec 03 19:30:27 crc kubenswrapper[4731]: E1203 19:30:27.166513 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerName="extract-content" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.166519 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerName="extract-content" Dec 03 19:30:27 crc kubenswrapper[4731]: E1203 19:30:27.166545 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerName="extract-utilities" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.166552 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerName="extract-utilities" Dec 03 19:30:27 crc kubenswrapper[4731]: E1203 19:30:27.166568 4731 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerName="registry-server" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.166574 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerName="registry-server" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.166769 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="b174f58e-e6a1-4c08-96bd-5bbcd6fec3f3" containerName="registry-server" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.166787 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fce13ad-818f-4fae-906c-ce7ff951c64f" containerName="collect-profiles" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.168205 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.195110 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcm7m"] Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.254598 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp9pz\" (UniqueName: \"kubernetes.io/projected/e73a2534-b37a-4139-bf95-d0918117a96b-kube-api-access-lp9pz\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.254709 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-catalog-content\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.254765 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-utilities\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.357355 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp9pz\" (UniqueName: \"kubernetes.io/projected/e73a2534-b37a-4139-bf95-d0918117a96b-kube-api-access-lp9pz\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.357428 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-catalog-content\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.357492 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-utilities\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.358112 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-utilities\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.358584 4731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-catalog-content\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.381414 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp9pz\" (UniqueName: \"kubernetes.io/projected/e73a2534-b37a-4139-bf95-d0918117a96b-kube-api-access-lp9pz\") pod \"community-operators-vcm7m\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:27 crc kubenswrapper[4731]: I1203 19:30:27.546723 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:28 crc kubenswrapper[4731]: I1203 19:30:28.156757 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcm7m"] Dec 03 19:30:28 crc kubenswrapper[4731]: I1203 19:30:28.409781 4731 generic.go:334] "Generic (PLEG): container finished" podID="e73a2534-b37a-4139-bf95-d0918117a96b" containerID="9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b" exitCode=0 Dec 03 19:30:28 crc kubenswrapper[4731]: I1203 19:30:28.409846 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcm7m" event={"ID":"e73a2534-b37a-4139-bf95-d0918117a96b","Type":"ContainerDied","Data":"9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b"} Dec 03 19:30:28 crc kubenswrapper[4731]: I1203 19:30:28.409883 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcm7m" event={"ID":"e73a2534-b37a-4139-bf95-d0918117a96b","Type":"ContainerStarted","Data":"ffb265e9da40ccc87a7897b7aaebcc2014a0d9e8cf1536430fbbc43fd4025d45"} Dec 03 19:30:29 crc 
kubenswrapper[4731]: I1203 19:30:29.421739 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcm7m" event={"ID":"e73a2534-b37a-4139-bf95-d0918117a96b","Type":"ContainerStarted","Data":"edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6"} Dec 03 19:30:30 crc kubenswrapper[4731]: I1203 19:30:30.433768 4731 generic.go:334] "Generic (PLEG): container finished" podID="e73a2534-b37a-4139-bf95-d0918117a96b" containerID="edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6" exitCode=0 Dec 03 19:30:30 crc kubenswrapper[4731]: I1203 19:30:30.433842 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcm7m" event={"ID":"e73a2534-b37a-4139-bf95-d0918117a96b","Type":"ContainerDied","Data":"edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6"} Dec 03 19:30:31 crc kubenswrapper[4731]: I1203 19:30:31.447560 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcm7m" event={"ID":"e73a2534-b37a-4139-bf95-d0918117a96b","Type":"ContainerStarted","Data":"a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637"} Dec 03 19:30:31 crc kubenswrapper[4731]: I1203 19:30:31.469791 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vcm7m" podStartSLOduration=2.025804787 podStartE2EDuration="4.469769676s" podCreationTimestamp="2025-12-03 19:30:27 +0000 UTC" firstStartedPulling="2025-12-03 19:30:28.411786248 +0000 UTC m=+2149.010380712" lastFinishedPulling="2025-12-03 19:30:30.855751147 +0000 UTC m=+2151.454345601" observedRunningTime="2025-12-03 19:30:31.466529017 +0000 UTC m=+2152.065123471" watchObservedRunningTime="2025-12-03 19:30:31.469769676 +0000 UTC m=+2152.068364140" Dec 03 19:30:37 crc kubenswrapper[4731]: I1203 19:30:37.547204 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:37 crc kubenswrapper[4731]: I1203 19:30:37.547744 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:37 crc kubenswrapper[4731]: I1203 19:30:37.594413 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:38 crc kubenswrapper[4731]: I1203 19:30:38.560771 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:38 crc kubenswrapper[4731]: I1203 19:30:38.609012 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcm7m"] Dec 03 19:30:40 crc kubenswrapper[4731]: I1203 19:30:40.535362 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vcm7m" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" containerName="registry-server" containerID="cri-o://a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637" gracePeriod=2 Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.502017 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.548569 4731 generic.go:334] "Generic (PLEG): container finished" podID="e73a2534-b37a-4139-bf95-d0918117a96b" containerID="a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637" exitCode=0 Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.548619 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcm7m" event={"ID":"e73a2534-b37a-4139-bf95-d0918117a96b","Type":"ContainerDied","Data":"a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637"} Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.548653 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcm7m" event={"ID":"e73a2534-b37a-4139-bf95-d0918117a96b","Type":"ContainerDied","Data":"ffb265e9da40ccc87a7897b7aaebcc2014a0d9e8cf1536430fbbc43fd4025d45"} Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.548669 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vcm7m" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.548676 4731 scope.go:117] "RemoveContainer" containerID="a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.555600 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-catalog-content\") pod \"e73a2534-b37a-4139-bf95-d0918117a96b\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.555789 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp9pz\" (UniqueName: \"kubernetes.io/projected/e73a2534-b37a-4139-bf95-d0918117a96b-kube-api-access-lp9pz\") pod \"e73a2534-b37a-4139-bf95-d0918117a96b\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.555828 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-utilities\") pod \"e73a2534-b37a-4139-bf95-d0918117a96b\" (UID: \"e73a2534-b37a-4139-bf95-d0918117a96b\") " Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.557378 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-utilities" (OuterVolumeSpecName: "utilities") pod "e73a2534-b37a-4139-bf95-d0918117a96b" (UID: "e73a2534-b37a-4139-bf95-d0918117a96b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.562029 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e73a2534-b37a-4139-bf95-d0918117a96b-kube-api-access-lp9pz" (OuterVolumeSpecName: "kube-api-access-lp9pz") pod "e73a2534-b37a-4139-bf95-d0918117a96b" (UID: "e73a2534-b37a-4139-bf95-d0918117a96b"). InnerVolumeSpecName "kube-api-access-lp9pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.569991 4731 scope.go:117] "RemoveContainer" containerID="edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.623037 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e73a2534-b37a-4139-bf95-d0918117a96b" (UID: "e73a2534-b37a-4139-bf95-d0918117a96b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.627018 4731 scope.go:117] "RemoveContainer" containerID="9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.658580 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.658787 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp9pz\" (UniqueName: \"kubernetes.io/projected/e73a2534-b37a-4139-bf95-d0918117a96b-kube-api-access-lp9pz\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.658829 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e73a2534-b37a-4139-bf95-d0918117a96b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.676674 4731 scope.go:117] "RemoveContainer" containerID="a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637" Dec 03 19:30:41 crc kubenswrapper[4731]: E1203 19:30:41.677343 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637\": container with ID starting with a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637 not found: ID does not exist" containerID="a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.677398 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637"} err="failed to get container status 
\"a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637\": rpc error: code = NotFound desc = could not find container \"a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637\": container with ID starting with a21904fe3f33e7c9282a176c1ad0f37ca701e346db3bbe13bdc86422eaf55637 not found: ID does not exist" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.677424 4731 scope.go:117] "RemoveContainer" containerID="edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6" Dec 03 19:30:41 crc kubenswrapper[4731]: E1203 19:30:41.678054 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6\": container with ID starting with edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6 not found: ID does not exist" containerID="edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.678113 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6"} err="failed to get container status \"edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6\": rpc error: code = NotFound desc = could not find container \"edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6\": container with ID starting with edac0136615fed87b7b2f561d340d135c53bc98b79306446c1976c49a9ed78f6 not found: ID does not exist" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.678152 4731 scope.go:117] "RemoveContainer" containerID="9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b" Dec 03 19:30:41 crc kubenswrapper[4731]: E1203 19:30:41.678568 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b\": container with ID starting with 9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b not found: ID does not exist" containerID="9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.678600 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b"} err="failed to get container status \"9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b\": rpc error: code = NotFound desc = could not find container \"9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b\": container with ID starting with 9dda0db2aee5d7a7c72b44bf806e5a9eba33b4bcf773001db9f7b567fb4fae1b not found: ID does not exist" Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.930550 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcm7m"] Dec 03 19:30:41 crc kubenswrapper[4731]: I1203 19:30:41.940945 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vcm7m"] Dec 03 19:30:43 crc kubenswrapper[4731]: I1203 19:30:43.867839 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" path="/var/lib/kubelet/pods/e73a2534-b37a-4139-bf95-d0918117a96b/volumes" Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.468095 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.468592 4731 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.468650 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.469625 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.469703 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" gracePeriod=600 Dec 03 19:30:56 crc kubenswrapper[4731]: E1203 19:30:56.607924 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:30:56 crc kubenswrapper[4731]: E1203 19:30:56.612578 4731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95dced4d_3fd5_43d3_b87d_21ec9c80de8b.slice/crio-2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331.scope\": RecentStats: unable to find data in memory cache]" Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.691701 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" exitCode=0 Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.691746 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331"} Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.691821 4731 scope.go:117] "RemoveContainer" containerID="7cf6f60cbac1430c262c450bc8ec3373da0e50f736eaecd23c4a7f1ca121d110" Dec 03 19:30:56 crc kubenswrapper[4731]: I1203 19:30:56.692589 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:30:56 crc kubenswrapper[4731]: E1203 19:30:56.692930 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:31:08 crc kubenswrapper[4731]: I1203 19:31:08.858717 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:31:08 crc kubenswrapper[4731]: E1203 19:31:08.859500 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:31:19 crc kubenswrapper[4731]: I1203 19:31:19.858349 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:31:19 crc kubenswrapper[4731]: E1203 19:31:19.860587 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:31:30 crc kubenswrapper[4731]: I1203 19:31:30.857674 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:31:30 crc kubenswrapper[4731]: E1203 19:31:30.859226 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:31:44 crc kubenswrapper[4731]: I1203 19:31:44.855952 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:31:44 crc kubenswrapper[4731]: E1203 19:31:44.856935 4731 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:31:58 crc kubenswrapper[4731]: I1203 19:31:58.856518 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:31:58 crc kubenswrapper[4731]: E1203 19:31:58.857391 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.267895 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hzq7s"] Dec 03 19:32:08 crc kubenswrapper[4731]: E1203 19:32:08.269167 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" containerName="registry-server" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.269184 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" containerName="registry-server" Dec 03 19:32:08 crc kubenswrapper[4731]: E1203 19:32:08.269208 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" containerName="extract-utilities" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.269215 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" 
containerName="extract-utilities" Dec 03 19:32:08 crc kubenswrapper[4731]: E1203 19:32:08.269238 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" containerName="extract-content" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.269245 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" containerName="extract-content" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.269463 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e73a2534-b37a-4139-bf95-d0918117a96b" containerName="registry-server" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.274115 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.281115 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hzq7s"] Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.394367 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98qvc\" (UniqueName: \"kubernetes.io/projected/bd80e651-b01a-4bbd-a306-bab85163fda4-kube-api-access-98qvc\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.394791 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-utilities\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.395013 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-catalog-content\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.455701 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jzwzx"] Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.458961 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.468457 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzwzx"] Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.497024 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-catalog-content\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.497140 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98qvc\" (UniqueName: \"kubernetes.io/projected/bd80e651-b01a-4bbd-a306-bab85163fda4-kube-api-access-98qvc\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.497166 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-utilities\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.497754 
4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-utilities\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.497848 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-catalog-content\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.527420 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98qvc\" (UniqueName: \"kubernetes.io/projected/bd80e651-b01a-4bbd-a306-bab85163fda4-kube-api-access-98qvc\") pod \"redhat-operators-hzq7s\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.599669 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-utilities\") pod \"redhat-marketplace-jzwzx\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.599737 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzkr\" (UniqueName: \"kubernetes.io/projected/561d3430-4671-4aab-b191-af5d620ea7fc-kube-api-access-dpzkr\") pod \"redhat-marketplace-jzwzx\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.599962 4731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-catalog-content\") pod \"redhat-marketplace-jzwzx\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.616043 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.702952 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-utilities\") pod \"redhat-marketplace-jzwzx\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.703445 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzkr\" (UniqueName: \"kubernetes.io/projected/561d3430-4671-4aab-b191-af5d620ea7fc-kube-api-access-dpzkr\") pod \"redhat-marketplace-jzwzx\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.703588 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-utilities\") pod \"redhat-marketplace-jzwzx\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.703607 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-catalog-content\") pod \"redhat-marketplace-jzwzx\" (UID: 
\"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.703948 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-catalog-content\") pod \"redhat-marketplace-jzwzx\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.723735 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzkr\" (UniqueName: \"kubernetes.io/projected/561d3430-4671-4aab-b191-af5d620ea7fc-kube-api-access-dpzkr\") pod \"redhat-marketplace-jzwzx\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:08 crc kubenswrapper[4731]: I1203 19:32:08.792305 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:09 crc kubenswrapper[4731]: I1203 19:32:09.153180 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hzq7s"] Dec 03 19:32:09 crc kubenswrapper[4731]: I1203 19:32:09.330134 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzwzx"] Dec 03 19:32:09 crc kubenswrapper[4731]: W1203 19:32:09.331934 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod561d3430_4671_4aab_b191_af5d620ea7fc.slice/crio-b2ff87274102924bdbb88dd2bd9c213ff97016e332a12492d6cdf4373da29de2 WatchSource:0}: Error finding container b2ff87274102924bdbb88dd2bd9c213ff97016e332a12492d6cdf4373da29de2: Status 404 returned error can't find the container with id b2ff87274102924bdbb88dd2bd9c213ff97016e332a12492d6cdf4373da29de2 Dec 03 19:32:09 crc kubenswrapper[4731]: 
I1203 19:32:09.390287 4731 generic.go:334] "Generic (PLEG): container finished" podID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerID="060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe" exitCode=0 Dec 03 19:32:09 crc kubenswrapper[4731]: I1203 19:32:09.390543 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzq7s" event={"ID":"bd80e651-b01a-4bbd-a306-bab85163fda4","Type":"ContainerDied","Data":"060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe"} Dec 03 19:32:09 crc kubenswrapper[4731]: I1203 19:32:09.390579 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzq7s" event={"ID":"bd80e651-b01a-4bbd-a306-bab85163fda4","Type":"ContainerStarted","Data":"65908a8e2dbeb7b9e1a5d35045f5e59839bdb07972fee3dfbf4d751bf540b5e3"} Dec 03 19:32:09 crc kubenswrapper[4731]: I1203 19:32:09.393175 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzwzx" event={"ID":"561d3430-4671-4aab-b191-af5d620ea7fc","Type":"ContainerStarted","Data":"b2ff87274102924bdbb88dd2bd9c213ff97016e332a12492d6cdf4373da29de2"} Dec 03 19:32:09 crc kubenswrapper[4731]: I1203 19:32:09.393757 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:32:10 crc kubenswrapper[4731]: I1203 19:32:10.408492 4731 generic.go:334] "Generic (PLEG): container finished" podID="561d3430-4671-4aab-b191-af5d620ea7fc" containerID="d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971" exitCode=0 Dec 03 19:32:10 crc kubenswrapper[4731]: I1203 19:32:10.408578 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzwzx" event={"ID":"561d3430-4671-4aab-b191-af5d620ea7fc","Type":"ContainerDied","Data":"d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971"} Dec 03 19:32:10 crc kubenswrapper[4731]: I1203 19:32:10.416652 4731 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzq7s" event={"ID":"bd80e651-b01a-4bbd-a306-bab85163fda4","Type":"ContainerStarted","Data":"5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718"} Dec 03 19:32:11 crc kubenswrapper[4731]: I1203 19:32:11.430724 4731 generic.go:334] "Generic (PLEG): container finished" podID="561d3430-4671-4aab-b191-af5d620ea7fc" containerID="2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a" exitCode=0 Dec 03 19:32:11 crc kubenswrapper[4731]: I1203 19:32:11.430825 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzwzx" event={"ID":"561d3430-4671-4aab-b191-af5d620ea7fc","Type":"ContainerDied","Data":"2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a"} Dec 03 19:32:11 crc kubenswrapper[4731]: I1203 19:32:11.434895 4731 generic.go:334] "Generic (PLEG): container finished" podID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerID="5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718" exitCode=0 Dec 03 19:32:11 crc kubenswrapper[4731]: I1203 19:32:11.434931 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzq7s" event={"ID":"bd80e651-b01a-4bbd-a306-bab85163fda4","Type":"ContainerDied","Data":"5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718"} Dec 03 19:32:12 crc kubenswrapper[4731]: I1203 19:32:12.471081 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzq7s" event={"ID":"bd80e651-b01a-4bbd-a306-bab85163fda4","Type":"ContainerStarted","Data":"a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0"} Dec 03 19:32:12 crc kubenswrapper[4731]: I1203 19:32:12.477112 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzwzx" 
event={"ID":"561d3430-4671-4aab-b191-af5d620ea7fc","Type":"ContainerStarted","Data":"9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7"} Dec 03 19:32:12 crc kubenswrapper[4731]: I1203 19:32:12.497194 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hzq7s" podStartSLOduration=1.95166588 podStartE2EDuration="4.497166177s" podCreationTimestamp="2025-12-03 19:32:08 +0000 UTC" firstStartedPulling="2025-12-03 19:32:09.393322749 +0000 UTC m=+2249.991917213" lastFinishedPulling="2025-12-03 19:32:11.938823036 +0000 UTC m=+2252.537417510" observedRunningTime="2025-12-03 19:32:12.490569254 +0000 UTC m=+2253.089163728" watchObservedRunningTime="2025-12-03 19:32:12.497166177 +0000 UTC m=+2253.095760641" Dec 03 19:32:12 crc kubenswrapper[4731]: I1203 19:32:12.529150 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jzwzx" podStartSLOduration=3.107316745 podStartE2EDuration="4.529115227s" podCreationTimestamp="2025-12-03 19:32:08 +0000 UTC" firstStartedPulling="2025-12-03 19:32:10.410867174 +0000 UTC m=+2251.009461648" lastFinishedPulling="2025-12-03 19:32:11.832665666 +0000 UTC m=+2252.431260130" observedRunningTime="2025-12-03 19:32:12.521140473 +0000 UTC m=+2253.119734947" watchObservedRunningTime="2025-12-03 19:32:12.529115227 +0000 UTC m=+2253.127709701" Dec 03 19:32:13 crc kubenswrapper[4731]: I1203 19:32:13.856831 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:32:13 crc kubenswrapper[4731]: E1203 19:32:13.857714 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:32:18 crc kubenswrapper[4731]: I1203 19:32:18.617583 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:18 crc kubenswrapper[4731]: I1203 19:32:18.618891 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:18 crc kubenswrapper[4731]: I1203 19:32:18.688063 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:18 crc kubenswrapper[4731]: I1203 19:32:18.792627 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:18 crc kubenswrapper[4731]: I1203 19:32:18.793944 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:18 crc kubenswrapper[4731]: I1203 19:32:18.883087 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:19 crc kubenswrapper[4731]: I1203 19:32:19.595894 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:19 crc kubenswrapper[4731]: I1203 19:32:19.611954 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:20 crc kubenswrapper[4731]: I1203 19:32:20.125390 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hzq7s"] Dec 03 19:32:21 crc kubenswrapper[4731]: I1203 19:32:21.559896 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hzq7s" 
podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerName="registry-server" containerID="cri-o://a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0" gracePeriod=2 Dec 03 19:32:21 crc kubenswrapper[4731]: I1203 19:32:21.931994 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzwzx"] Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.059023 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.154138 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-catalog-content\") pod \"bd80e651-b01a-4bbd-a306-bab85163fda4\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.154565 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-utilities\") pod \"bd80e651-b01a-4bbd-a306-bab85163fda4\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.154686 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98qvc\" (UniqueName: \"kubernetes.io/projected/bd80e651-b01a-4bbd-a306-bab85163fda4-kube-api-access-98qvc\") pod \"bd80e651-b01a-4bbd-a306-bab85163fda4\" (UID: \"bd80e651-b01a-4bbd-a306-bab85163fda4\") " Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.157692 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-utilities" (OuterVolumeSpecName: "utilities") pod "bd80e651-b01a-4bbd-a306-bab85163fda4" (UID: "bd80e651-b01a-4bbd-a306-bab85163fda4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.170609 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd80e651-b01a-4bbd-a306-bab85163fda4-kube-api-access-98qvc" (OuterVolumeSpecName: "kube-api-access-98qvc") pod "bd80e651-b01a-4bbd-a306-bab85163fda4" (UID: "bd80e651-b01a-4bbd-a306-bab85163fda4"). InnerVolumeSpecName "kube-api-access-98qvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.257490 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.257535 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98qvc\" (UniqueName: \"kubernetes.io/projected/bd80e651-b01a-4bbd-a306-bab85163fda4-kube-api-access-98qvc\") on node \"crc\" DevicePath \"\"" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.278392 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd80e651-b01a-4bbd-a306-bab85163fda4" (UID: "bd80e651-b01a-4bbd-a306-bab85163fda4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.359673 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd80e651-b01a-4bbd-a306-bab85163fda4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.571378 4731 generic.go:334] "Generic (PLEG): container finished" podID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerID="a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0" exitCode=0 Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.571431 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzq7s" event={"ID":"bd80e651-b01a-4bbd-a306-bab85163fda4","Type":"ContainerDied","Data":"a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0"} Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.571803 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzq7s" event={"ID":"bd80e651-b01a-4bbd-a306-bab85163fda4","Type":"ContainerDied","Data":"65908a8e2dbeb7b9e1a5d35045f5e59839bdb07972fee3dfbf4d751bf540b5e3"} Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.571497 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hzq7s" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.571859 4731 scope.go:117] "RemoveContainer" containerID="a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.571935 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jzwzx" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" containerName="registry-server" containerID="cri-o://9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7" gracePeriod=2 Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.605526 4731 scope.go:117] "RemoveContainer" containerID="5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.625240 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hzq7s"] Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.634017 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hzq7s"] Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.656662 4731 scope.go:117] "RemoveContainer" containerID="060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.805482 4731 scope.go:117] "RemoveContainer" containerID="a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0" Dec 03 19:32:22 crc kubenswrapper[4731]: E1203 19:32:22.806944 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0\": container with ID starting with a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0 not found: ID does not exist" containerID="a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0" Dec 03 19:32:22 crc 
kubenswrapper[4731]: I1203 19:32:22.806982 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0"} err="failed to get container status \"a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0\": rpc error: code = NotFound desc = could not find container \"a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0\": container with ID starting with a473e130d42fa08b8e023e9dd951d8677908958cfb1676a72b296ed1bc811ba0 not found: ID does not exist" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.807007 4731 scope.go:117] "RemoveContainer" containerID="5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718" Dec 03 19:32:22 crc kubenswrapper[4731]: E1203 19:32:22.807205 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718\": container with ID starting with 5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718 not found: ID does not exist" containerID="5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.807229 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718"} err="failed to get container status \"5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718\": rpc error: code = NotFound desc = could not find container \"5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718\": container with ID starting with 5897cbac897c1123a5fee268e39904bde1b433fa061cce8003b9de404c9c3718 not found: ID does not exist" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.807242 4731 scope.go:117] "RemoveContainer" containerID="060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe" Dec 03 
19:32:22 crc kubenswrapper[4731]: E1203 19:32:22.807452 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe\": container with ID starting with 060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe not found: ID does not exist" containerID="060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe" Dec 03 19:32:22 crc kubenswrapper[4731]: I1203 19:32:22.807504 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe"} err="failed to get container status \"060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe\": rpc error: code = NotFound desc = could not find container \"060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe\": container with ID starting with 060a2d14a94810e13df4c0085ae275110d04a36ce5c51b7e4458cc2b0956d3fe not found: ID does not exist" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.138876 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.278974 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpzkr\" (UniqueName: \"kubernetes.io/projected/561d3430-4671-4aab-b191-af5d620ea7fc-kube-api-access-dpzkr\") pod \"561d3430-4671-4aab-b191-af5d620ea7fc\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.279051 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-utilities\") pod \"561d3430-4671-4aab-b191-af5d620ea7fc\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.279092 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-catalog-content\") pod \"561d3430-4671-4aab-b191-af5d620ea7fc\" (UID: \"561d3430-4671-4aab-b191-af5d620ea7fc\") " Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.280115 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-utilities" (OuterVolumeSpecName: "utilities") pod "561d3430-4671-4aab-b191-af5d620ea7fc" (UID: "561d3430-4671-4aab-b191-af5d620ea7fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.289143 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561d3430-4671-4aab-b191-af5d620ea7fc-kube-api-access-dpzkr" (OuterVolumeSpecName: "kube-api-access-dpzkr") pod "561d3430-4671-4aab-b191-af5d620ea7fc" (UID: "561d3430-4671-4aab-b191-af5d620ea7fc"). InnerVolumeSpecName "kube-api-access-dpzkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.304515 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "561d3430-4671-4aab-b191-af5d620ea7fc" (UID: "561d3430-4671-4aab-b191-af5d620ea7fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.381747 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpzkr\" (UniqueName: \"kubernetes.io/projected/561d3430-4671-4aab-b191-af5d620ea7fc-kube-api-access-dpzkr\") on node \"crc\" DevicePath \"\"" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.381782 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.381812 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/561d3430-4671-4aab-b191-af5d620ea7fc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.585364 4731 generic.go:334] "Generic (PLEG): container finished" podID="561d3430-4671-4aab-b191-af5d620ea7fc" containerID="9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7" exitCode=0 Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.585458 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzwzx" event={"ID":"561d3430-4671-4aab-b191-af5d620ea7fc","Type":"ContainerDied","Data":"9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7"} Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.586437 4731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-jzwzx" event={"ID":"561d3430-4671-4aab-b191-af5d620ea7fc","Type":"ContainerDied","Data":"b2ff87274102924bdbb88dd2bd9c213ff97016e332a12492d6cdf4373da29de2"} Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.586478 4731 scope.go:117] "RemoveContainer" containerID="9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.585488 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzwzx" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.617811 4731 scope.go:117] "RemoveContainer" containerID="2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.636220 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzwzx"] Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.649406 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzwzx"] Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.649784 4731 scope.go:117] "RemoveContainer" containerID="d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.675582 4731 scope.go:117] "RemoveContainer" containerID="9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7" Dec 03 19:32:23 crc kubenswrapper[4731]: E1203 19:32:23.676181 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7\": container with ID starting with 9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7 not found: ID does not exist" containerID="9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.676236 4731 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7"} err="failed to get container status \"9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7\": rpc error: code = NotFound desc = could not find container \"9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7\": container with ID starting with 9150bb583cf5c280f14fcb5ae8e0041d6751ceffeedaf1d498a6db8dab5b7ec7 not found: ID does not exist" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.676278 4731 scope.go:117] "RemoveContainer" containerID="2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a" Dec 03 19:32:23 crc kubenswrapper[4731]: E1203 19:32:23.676619 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a\": container with ID starting with 2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a not found: ID does not exist" containerID="2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.676672 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a"} err="failed to get container status \"2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a\": rpc error: code = NotFound desc = could not find container \"2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a\": container with ID starting with 2f2eaeb131616c2a162f279a9873dbb2ee97ed47749f15e5dca00607d258159a not found: ID does not exist" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.676720 4731 scope.go:117] "RemoveContainer" containerID="d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971" Dec 03 19:32:23 crc kubenswrapper[4731]: E1203 
19:32:23.677066 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971\": container with ID starting with d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971 not found: ID does not exist" containerID="d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.677093 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971"} err="failed to get container status \"d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971\": rpc error: code = NotFound desc = could not find container \"d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971\": container with ID starting with d322fa036502585390200201bce1f1b34c0f017eb4c7e6f8530b8e409fb6e971 not found: ID does not exist" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.876844 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" path="/var/lib/kubelet/pods/561d3430-4671-4aab-b191-af5d620ea7fc/volumes" Dec 03 19:32:23 crc kubenswrapper[4731]: I1203 19:32:23.879136 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" path="/var/lib/kubelet/pods/bd80e651-b01a-4bbd-a306-bab85163fda4/volumes" Dec 03 19:32:24 crc kubenswrapper[4731]: I1203 19:32:24.857745 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:32:24 crc kubenswrapper[4731]: E1203 19:32:24.858194 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:32:38 crc kubenswrapper[4731]: I1203 19:32:38.857142 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:32:38 crc kubenswrapper[4731]: E1203 19:32:38.859704 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:32:50 crc kubenswrapper[4731]: I1203 19:32:50.856663 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:32:50 crc kubenswrapper[4731]: E1203 19:32:50.857879 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:33:05 crc kubenswrapper[4731]: I1203 19:33:05.857738 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:33:05 crc kubenswrapper[4731]: E1203 19:33:05.858489 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:33:19 crc kubenswrapper[4731]: I1203 19:33:19.863471 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:33:19 crc kubenswrapper[4731]: E1203 19:33:19.864293 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:33:31 crc kubenswrapper[4731]: I1203 19:33:31.856737 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:33:31 crc kubenswrapper[4731]: E1203 19:33:31.857486 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:33:44 crc kubenswrapper[4731]: I1203 19:33:44.857165 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:33:44 crc kubenswrapper[4731]: E1203 19:33:44.858033 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:33:55 crc kubenswrapper[4731]: I1203 19:33:55.857615 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:33:55 crc kubenswrapper[4731]: E1203 19:33:55.858439 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:33:58 crc kubenswrapper[4731]: I1203 19:33:58.574215 4731 generic.go:334] "Generic (PLEG): container finished" podID="63e88ef4-d82b-4798-b386-8158184b32d4" containerID="b85596f09024ac63616b5bf05afd4f88a0b0ef19966b64f52c8b04c57dfe2811" exitCode=0 Dec 03 19:33:58 crc kubenswrapper[4731]: I1203 19:33:58.574312 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" event={"ID":"63e88ef4-d82b-4798-b386-8158184b32d4","Type":"ContainerDied","Data":"b85596f09024ac63616b5bf05afd4f88a0b0ef19966b64f52c8b04c57dfe2811"} Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.121303 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.147930 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-combined-ca-bundle\") pod \"63e88ef4-d82b-4798-b386-8158184b32d4\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.148058 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8dwn\" (UniqueName: \"kubernetes.io/projected/63e88ef4-d82b-4798-b386-8158184b32d4-kube-api-access-b8dwn\") pod \"63e88ef4-d82b-4798-b386-8158184b32d4\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.149131 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-inventory\") pod \"63e88ef4-d82b-4798-b386-8158184b32d4\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.149186 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-ssh-key\") pod \"63e88ef4-d82b-4798-b386-8158184b32d4\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.149218 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-secret-0\") pod \"63e88ef4-d82b-4798-b386-8158184b32d4\" (UID: \"63e88ef4-d82b-4798-b386-8158184b32d4\") " Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.191013 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "63e88ef4-d82b-4798-b386-8158184b32d4" (UID: "63e88ef4-d82b-4798-b386-8158184b32d4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.191546 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e88ef4-d82b-4798-b386-8158184b32d4-kube-api-access-b8dwn" (OuterVolumeSpecName: "kube-api-access-b8dwn") pod "63e88ef4-d82b-4798-b386-8158184b32d4" (UID: "63e88ef4-d82b-4798-b386-8158184b32d4"). InnerVolumeSpecName "kube-api-access-b8dwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.197971 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "63e88ef4-d82b-4798-b386-8158184b32d4" (UID: "63e88ef4-d82b-4798-b386-8158184b32d4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.198269 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-inventory" (OuterVolumeSpecName: "inventory") pod "63e88ef4-d82b-4798-b386-8158184b32d4" (UID: "63e88ef4-d82b-4798-b386-8158184b32d4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.199698 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "63e88ef4-d82b-4798-b386-8158184b32d4" (UID: "63e88ef4-d82b-4798-b386-8158184b32d4"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.251179 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.251218 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.251230 4731 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.251335 4731 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e88ef4-d82b-4798-b386-8158184b32d4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.251349 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8dwn\" (UniqueName: \"kubernetes.io/projected/63e88ef4-d82b-4798-b386-8158184b32d4-kube-api-access-b8dwn\") on node \"crc\" DevicePath \"\"" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.602670 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" event={"ID":"63e88ef4-d82b-4798-b386-8158184b32d4","Type":"ContainerDied","Data":"7b746116e0239c678a24f6f799c7c33cc97c1b8307d4ba1befee3c706a91b6ac"} Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.602719 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b746116e0239c678a24f6f799c7c33cc97c1b8307d4ba1befee3c706a91b6ac" Dec 03 
19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.602738 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.716226 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z"] Dec 03 19:34:00 crc kubenswrapper[4731]: E1203 19:34:00.716737 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e88ef4-d82b-4798-b386-8158184b32d4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.716764 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e88ef4-d82b-4798-b386-8158184b32d4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 19:34:00 crc kubenswrapper[4731]: E1203 19:34:00.716784 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerName="registry-server" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.716793 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerName="registry-server" Dec 03 19:34:00 crc kubenswrapper[4731]: E1203 19:34:00.716827 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" containerName="registry-server" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.716834 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" containerName="registry-server" Dec 03 19:34:00 crc kubenswrapper[4731]: E1203 19:34:00.716848 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" containerName="extract-content" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.716856 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" 
containerName="extract-content" Dec 03 19:34:00 crc kubenswrapper[4731]: E1203 19:34:00.716872 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerName="extract-utilities" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.716879 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerName="extract-utilities" Dec 03 19:34:00 crc kubenswrapper[4731]: E1203 19:34:00.716917 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" containerName="extract-utilities" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.716925 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" containerName="extract-utilities" Dec 03 19:34:00 crc kubenswrapper[4731]: E1203 19:34:00.716935 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerName="extract-content" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.716942 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerName="extract-content" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.717228 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e88ef4-d82b-4798-b386-8158184b32d4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.717279 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd80e651-b01a-4bbd-a306-bab85163fda4" containerName="registry-server" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.717299 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="561d3430-4671-4aab-b191-af5d620ea7fc" containerName="registry-server" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.718219 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.725170 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.727930 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.728384 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.728456 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.728456 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.728465 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.730371 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.738173 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z"] Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.770104 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: 
I1203 19:34:00.770538 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.770701 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgk2b\" (UniqueName: \"kubernetes.io/projected/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-kube-api-access-cgk2b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.770793 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.770907 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.771009 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.771105 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.771189 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.771303 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.872714 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.872761 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.872788 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.872810 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.872843 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.872897 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.872943 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgk2b\" (UniqueName: \"kubernetes.io/projected/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-kube-api-access-cgk2b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.872963 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.873008 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.875433 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc 
kubenswrapper[4731]: I1203 19:34:00.877839 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.877929 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.878177 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.878576 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.879461 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: 
\"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.881209 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.886287 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:00 crc kubenswrapper[4731]: I1203 19:34:00.892449 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgk2b\" (UniqueName: \"kubernetes.io/projected/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-kube-api-access-cgk2b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bdl5z\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:01 crc kubenswrapper[4731]: I1203 19:34:01.042730 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:34:01 crc kubenswrapper[4731]: I1203 19:34:01.569834 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z"] Dec 03 19:34:01 crc kubenswrapper[4731]: I1203 19:34:01.612806 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" event={"ID":"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac","Type":"ContainerStarted","Data":"69ad5cef2b0d3c1dec90314b799a3dc7cdf89a8bf193f2e99ea7661b91f731f7"} Dec 03 19:34:02 crc kubenswrapper[4731]: I1203 19:34:02.632223 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" event={"ID":"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac","Type":"ContainerStarted","Data":"6496912acf9f5f0f6509c01a32667ee843ea8a223b1bb8dd51d572ef8ccda6d8"} Dec 03 19:34:08 crc kubenswrapper[4731]: I1203 19:34:08.856493 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:34:08 crc kubenswrapper[4731]: E1203 19:34:08.857399 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:34:22 crc kubenswrapper[4731]: I1203 19:34:22.856022 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:34:22 crc kubenswrapper[4731]: E1203 19:34:22.857018 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:34:33 crc kubenswrapper[4731]: I1203 19:34:33.858288 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:34:33 crc kubenswrapper[4731]: E1203 19:34:33.859116 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:34:45 crc kubenswrapper[4731]: I1203 19:34:45.856517 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:34:45 crc kubenswrapper[4731]: E1203 19:34:45.857682 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:34:56 crc kubenswrapper[4731]: I1203 19:34:56.856610 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:34:56 crc kubenswrapper[4731]: E1203 19:34:56.857718 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:35:11 crc kubenswrapper[4731]: I1203 19:35:11.857226 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:35:11 crc kubenswrapper[4731]: E1203 19:35:11.858642 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:35:26 crc kubenswrapper[4731]: I1203 19:35:26.856925 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:35:26 crc kubenswrapper[4731]: E1203 19:35:26.857889 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:35:39 crc kubenswrapper[4731]: I1203 19:35:39.864858 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:35:39 crc kubenswrapper[4731]: E1203 19:35:39.866009 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:35:51 crc kubenswrapper[4731]: I1203 19:35:51.856699 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:35:51 crc kubenswrapper[4731]: E1203 19:35:51.858702 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:36:06 crc kubenswrapper[4731]: I1203 19:36:06.856414 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:36:08 crc kubenswrapper[4731]: I1203 19:36:08.042298 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"d1f6ed0bc9ec4bc4c31197505f1ff4ce62ba339b0c8f0b0c888f1a412002cab3"} Dec 03 19:36:08 crc kubenswrapper[4731]: I1203 19:36:08.074646 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" podStartSLOduration=127.528230384 podStartE2EDuration="2m8.074591716s" podCreationTimestamp="2025-12-03 19:34:00 +0000 UTC" firstStartedPulling="2025-12-03 19:34:01.573525642 +0000 UTC m=+2362.172120106" lastFinishedPulling="2025-12-03 19:34:02.119886964 +0000 UTC 
m=+2362.718481438" observedRunningTime="2025-12-03 19:34:02.651669438 +0000 UTC m=+2363.250263902" watchObservedRunningTime="2025-12-03 19:36:08.074591716 +0000 UTC m=+2488.673186190" Dec 03 19:36:58 crc kubenswrapper[4731]: I1203 19:36:58.527744 4731 generic.go:334] "Generic (PLEG): container finished" podID="dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" containerID="6496912acf9f5f0f6509c01a32667ee843ea8a223b1bb8dd51d572ef8ccda6d8" exitCode=0 Dec 03 19:36:58 crc kubenswrapper[4731]: I1203 19:36:58.527844 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" event={"ID":"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac","Type":"ContainerDied","Data":"6496912acf9f5f0f6509c01a32667ee843ea8a223b1bb8dd51d572ef8ccda6d8"} Dec 03 19:36:59 crc kubenswrapper[4731]: I1203 19:36:59.991836 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.106241 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-0\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.106939 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-1\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.107028 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-0\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.107085 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-inventory\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.107104 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-combined-ca-bundle\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.107158 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgk2b\" (UniqueName: \"kubernetes.io/projected/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-kube-api-access-cgk2b\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.107222 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-ssh-key\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.107301 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-extra-config-0\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc 
kubenswrapper[4731]: I1203 19:37:00.107333 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-1\") pod \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\" (UID: \"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac\") " Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.113427 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-kube-api-access-cgk2b" (OuterVolumeSpecName: "kube-api-access-cgk2b") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "kube-api-access-cgk2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.114407 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.137028 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.138355 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.143224 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.143803 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.145537 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-inventory" (OuterVolumeSpecName: "inventory") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.145981 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.152318 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" (UID: "dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209567 4731 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209611 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209621 4731 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209630 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgk2b\" (UniqueName: 
\"kubernetes.io/projected/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-kube-api-access-cgk2b\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209642 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209653 4731 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209662 4731 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209672 4731 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.209680 4731 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.549787 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" event={"ID":"dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac","Type":"ContainerDied","Data":"69ad5cef2b0d3c1dec90314b799a3dc7cdf89a8bf193f2e99ea7661b91f731f7"} Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.550288 4731 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="69ad5cef2b0d3c1dec90314b799a3dc7cdf89a8bf193f2e99ea7661b91f731f7" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.549852 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bdl5z" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.658147 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj"] Dec 03 19:37:00 crc kubenswrapper[4731]: E1203 19:37:00.658825 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.658892 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.659135 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.660068 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.663606 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.663816 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.663838 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.663931 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.665415 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h52jj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.669660 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj"] Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.822431 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.822581 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.822718 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74sn\" (UniqueName: \"kubernetes.io/projected/b0b0f663-169b-4270-be40-1b2dfab89560-kube-api-access-r74sn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.822950 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.823094 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.823129 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.823153 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.925404 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.925479 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.925507 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.925539 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.925583 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.925675 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.925756 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r74sn\" (UniqueName: \"kubernetes.io/projected/b0b0f663-169b-4270-be40-1b2dfab89560-kube-api-access-r74sn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.930173 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" 
(UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.930295 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.930688 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.930766 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.937512 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.937771 4731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.946209 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74sn\" (UniqueName: \"kubernetes.io/projected/b0b0f663-169b-4270-be40-1b2dfab89560-kube-api-access-r74sn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wsklj\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:00 crc kubenswrapper[4731]: I1203 19:37:00.992574 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:37:01 crc kubenswrapper[4731]: I1203 19:37:01.333081 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj"] Dec 03 19:37:01 crc kubenswrapper[4731]: I1203 19:37:01.559692 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" event={"ID":"b0b0f663-169b-4270-be40-1b2dfab89560","Type":"ContainerStarted","Data":"15649a3a6f696d8a2ad499d6d38975c1e21eedf6d8cd0d53eaf4d4122181a6c6"} Dec 03 19:37:02 crc kubenswrapper[4731]: I1203 19:37:02.572315 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" event={"ID":"b0b0f663-169b-4270-be40-1b2dfab89560","Type":"ContainerStarted","Data":"f8fb9c491d4252aa06bfe972ee598ccc18e09f650b7331792cac9d6c084e0f6c"} Dec 03 19:37:02 crc kubenswrapper[4731]: I1203 19:37:02.601004 4731 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" podStartSLOduration=1.999881241 podStartE2EDuration="2.600985324s" podCreationTimestamp="2025-12-03 19:37:00 +0000 UTC" firstStartedPulling="2025-12-03 19:37:01.341075343 +0000 UTC m=+2541.939669807" lastFinishedPulling="2025-12-03 19:37:01.942179426 +0000 UTC m=+2542.540773890" observedRunningTime="2025-12-03 19:37:02.59598381 +0000 UTC m=+2543.194578274" watchObservedRunningTime="2025-12-03 19:37:02.600985324 +0000 UTC m=+2543.199579788" Dec 03 19:38:26 crc kubenswrapper[4731]: I1203 19:38:26.469349 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:38:26 crc kubenswrapper[4731]: I1203 19:38:26.470643 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:38:56 crc kubenswrapper[4731]: I1203 19:38:56.468493 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:38:56 crc kubenswrapper[4731]: I1203 19:38:56.469067 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 03 19:39:26 crc kubenswrapper[4731]: I1203 19:39:26.468721 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:39:26 crc kubenswrapper[4731]: I1203 19:39:26.469415 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:39:26 crc kubenswrapper[4731]: I1203 19:39:26.469479 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:39:26 crc kubenswrapper[4731]: I1203 19:39:26.470536 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1f6ed0bc9ec4bc4c31197505f1ff4ce62ba339b0c8f0b0c888f1a412002cab3"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:39:26 crc kubenswrapper[4731]: I1203 19:39:26.470609 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://d1f6ed0bc9ec4bc4c31197505f1ff4ce62ba339b0c8f0b0c888f1a412002cab3" gracePeriod=600 Dec 03 19:39:27 crc kubenswrapper[4731]: I1203 19:39:27.014128 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" 
containerID="d1f6ed0bc9ec4bc4c31197505f1ff4ce62ba339b0c8f0b0c888f1a412002cab3" exitCode=0 Dec 03 19:39:27 crc kubenswrapper[4731]: I1203 19:39:27.014179 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"d1f6ed0bc9ec4bc4c31197505f1ff4ce62ba339b0c8f0b0c888f1a412002cab3"} Dec 03 19:39:27 crc kubenswrapper[4731]: I1203 19:39:27.014655 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48"} Dec 03 19:39:27 crc kubenswrapper[4731]: I1203 19:39:27.014712 4731 scope.go:117] "RemoveContainer" containerID="2eb7bf4f80f2ab3f34ba314be8bb2107f2a17ff3eb01ff40bbe892f939082331" Dec 03 19:39:28 crc kubenswrapper[4731]: I1203 19:39:28.029854 4731 generic.go:334] "Generic (PLEG): container finished" podID="b0b0f663-169b-4270-be40-1b2dfab89560" containerID="f8fb9c491d4252aa06bfe972ee598ccc18e09f650b7331792cac9d6c084e0f6c" exitCode=0 Dec 03 19:39:28 crc kubenswrapper[4731]: I1203 19:39:28.029947 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" event={"ID":"b0b0f663-169b-4270-be40-1b2dfab89560","Type":"ContainerDied","Data":"f8fb9c491d4252aa06bfe972ee598ccc18e09f650b7331792cac9d6c084e0f6c"} Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.541719 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.614719 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-inventory\") pod \"b0b0f663-169b-4270-be40-1b2dfab89560\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.614806 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r74sn\" (UniqueName: \"kubernetes.io/projected/b0b0f663-169b-4270-be40-1b2dfab89560-kube-api-access-r74sn\") pod \"b0b0f663-169b-4270-be40-1b2dfab89560\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.615001 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ssh-key\") pod \"b0b0f663-169b-4270-be40-1b2dfab89560\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.615053 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-2\") pod \"b0b0f663-169b-4270-be40-1b2dfab89560\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.615152 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-telemetry-combined-ca-bundle\") pod \"b0b0f663-169b-4270-be40-1b2dfab89560\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.615247 4731 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-1\") pod \"b0b0f663-169b-4270-be40-1b2dfab89560\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.615304 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-0\") pod \"b0b0f663-169b-4270-be40-1b2dfab89560\" (UID: \"b0b0f663-169b-4270-be40-1b2dfab89560\") " Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.625703 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b0b0f663-169b-4270-be40-1b2dfab89560" (UID: "b0b0f663-169b-4270-be40-1b2dfab89560"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.625812 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b0f663-169b-4270-be40-1b2dfab89560-kube-api-access-r74sn" (OuterVolumeSpecName: "kube-api-access-r74sn") pod "b0b0f663-169b-4270-be40-1b2dfab89560" (UID: "b0b0f663-169b-4270-be40-1b2dfab89560"). InnerVolumeSpecName "kube-api-access-r74sn". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.648003 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b0b0f663-169b-4270-be40-1b2dfab89560" (UID: "b0b0f663-169b-4270-be40-1b2dfab89560"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.652016 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b0b0f663-169b-4270-be40-1b2dfab89560" (UID: "b0b0f663-169b-4270-be40-1b2dfab89560"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.652033 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b0b0f663-169b-4270-be40-1b2dfab89560" (UID: "b0b0f663-169b-4270-be40-1b2dfab89560"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.656728 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-inventory" (OuterVolumeSpecName: "inventory") pod "b0b0f663-169b-4270-be40-1b2dfab89560" (UID: "b0b0f663-169b-4270-be40-1b2dfab89560"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.657451 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0b0f663-169b-4270-be40-1b2dfab89560" (UID: "b0b0f663-169b-4270-be40-1b2dfab89560"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.719311 4731 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.719350 4731 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.719360 4731 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.719371 4731 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.719380 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r74sn\" (UniqueName: \"kubernetes.io/projected/b0b0f663-169b-4270-be40-1b2dfab89560-kube-api-access-r74sn\") on node \"crc\" DevicePath \"\""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.719390 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 19:39:29 crc kubenswrapper[4731]: I1203 19:39:29.719400 4731 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0b0f663-169b-4270-be40-1b2dfab89560-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 03 19:39:30 crc kubenswrapper[4731]: I1203 19:39:30.075226 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj" event={"ID":"b0b0f663-169b-4270-be40-1b2dfab89560","Type":"ContainerDied","Data":"15649a3a6f696d8a2ad499d6d38975c1e21eedf6d8cd0d53eaf4d4122181a6c6"}
Dec 03 19:39:30 crc kubenswrapper[4731]: I1203 19:39:30.075300 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15649a3a6f696d8a2ad499d6d38975c1e21eedf6d8cd0d53eaf4d4122181a6c6"
Dec 03 19:39:30 crc kubenswrapper[4731]: I1203 19:39:30.075377 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wsklj"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.768709 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4nzvf"]
Dec 03 19:40:02 crc kubenswrapper[4731]: E1203 19:40:02.769779 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b0f663-169b-4270-be40-1b2dfab89560" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.769798 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b0f663-169b-4270-be40-1b2dfab89560" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.770045 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b0f663-169b-4270-be40-1b2dfab89560" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.771886 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.781621 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4nzvf"]
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.865386 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-utilities\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.865473 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96thz\" (UniqueName: \"kubernetes.io/projected/cb93f560-6aa7-4afd-ae7e-2470b3e43440-kube-api-access-96thz\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.865640 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-catalog-content\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.967362 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-utilities\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.967478 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96thz\" (UniqueName: \"kubernetes.io/projected/cb93f560-6aa7-4afd-ae7e-2470b3e43440-kube-api-access-96thz\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.967572 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-catalog-content\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.968084 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-catalog-content\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.968653 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-utilities\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:02 crc kubenswrapper[4731]: I1203 19:40:02.995280 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96thz\" (UniqueName: \"kubernetes.io/projected/cb93f560-6aa7-4afd-ae7e-2470b3e43440-kube-api-access-96thz\") pod \"certified-operators-4nzvf\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") " pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:03 crc kubenswrapper[4731]: I1203 19:40:03.107311 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:03 crc kubenswrapper[4731]: I1203 19:40:03.653113 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4nzvf"]
Dec 03 19:40:04 crc kubenswrapper[4731]: I1203 19:40:04.526132 4731 generic.go:334] "Generic (PLEG): container finished" podID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerID="3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1" exitCode=0
Dec 03 19:40:04 crc kubenswrapper[4731]: I1203 19:40:04.526290 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nzvf" event={"ID":"cb93f560-6aa7-4afd-ae7e-2470b3e43440","Type":"ContainerDied","Data":"3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1"}
Dec 03 19:40:04 crc kubenswrapper[4731]: I1203 19:40:04.526559 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nzvf" event={"ID":"cb93f560-6aa7-4afd-ae7e-2470b3e43440","Type":"ContainerStarted","Data":"24cadc6418fb34803140945b85e6caf209538934fa8a4524f12029d08a1a4885"}
Dec 03 19:40:04 crc kubenswrapper[4731]: I1203 19:40:04.529455 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 19:40:06 crc kubenswrapper[4731]: I1203 19:40:06.551878 4731 generic.go:334] "Generic (PLEG): container finished" podID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerID="fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5" exitCode=0
Dec 03 19:40:06 crc kubenswrapper[4731]: I1203 19:40:06.551941 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nzvf" event={"ID":"cb93f560-6aa7-4afd-ae7e-2470b3e43440","Type":"ContainerDied","Data":"fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5"}
Dec 03 19:40:07 crc kubenswrapper[4731]: I1203 19:40:07.565328 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nzvf" event={"ID":"cb93f560-6aa7-4afd-ae7e-2470b3e43440","Type":"ContainerStarted","Data":"a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa"}
Dec 03 19:40:07 crc kubenswrapper[4731]: I1203 19:40:07.594621 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4nzvf" podStartSLOduration=3.170832666 podStartE2EDuration="5.59458321s" podCreationTimestamp="2025-12-03 19:40:02 +0000 UTC" firstStartedPulling="2025-12-03 19:40:04.529144404 +0000 UTC m=+2725.127738868" lastFinishedPulling="2025-12-03 19:40:06.952894948 +0000 UTC m=+2727.551489412" observedRunningTime="2025-12-03 19:40:07.582703934 +0000 UTC m=+2728.181298398" watchObservedRunningTime="2025-12-03 19:40:07.59458321 +0000 UTC m=+2728.193177694"
Dec 03 19:40:13 crc kubenswrapper[4731]: I1203 19:40:13.108784 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:13 crc kubenswrapper[4731]: I1203 19:40:13.109582 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:13 crc kubenswrapper[4731]: I1203 19:40:13.254836 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:13 crc kubenswrapper[4731]: I1203 19:40:13.673187 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:13 crc kubenswrapper[4731]: I1203 19:40:13.726003 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4nzvf"]
Dec 03 19:40:15 crc kubenswrapper[4731]: I1203 19:40:15.638763 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4nzvf" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerName="registry-server" containerID="cri-o://a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa" gracePeriod=2
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.171704 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.288229 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-catalog-content\") pod \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") "
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.288504 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96thz\" (UniqueName: \"kubernetes.io/projected/cb93f560-6aa7-4afd-ae7e-2470b3e43440-kube-api-access-96thz\") pod \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") "
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.288615 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-utilities\") pod \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\" (UID: \"cb93f560-6aa7-4afd-ae7e-2470b3e43440\") "
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.289772 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-utilities" (OuterVolumeSpecName: "utilities") pod "cb93f560-6aa7-4afd-ae7e-2470b3e43440" (UID: "cb93f560-6aa7-4afd-ae7e-2470b3e43440"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.307617 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb93f560-6aa7-4afd-ae7e-2470b3e43440-kube-api-access-96thz" (OuterVolumeSpecName: "kube-api-access-96thz") pod "cb93f560-6aa7-4afd-ae7e-2470b3e43440" (UID: "cb93f560-6aa7-4afd-ae7e-2470b3e43440"). InnerVolumeSpecName "kube-api-access-96thz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.390332 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb93f560-6aa7-4afd-ae7e-2470b3e43440" (UID: "cb93f560-6aa7-4afd-ae7e-2470b3e43440"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.391006 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96thz\" (UniqueName: \"kubernetes.io/projected/cb93f560-6aa7-4afd-ae7e-2470b3e43440-kube-api-access-96thz\") on node \"crc\" DevicePath \"\""
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.391041 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.391050 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb93f560-6aa7-4afd-ae7e-2470b3e43440-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.652544 4731 generic.go:334] "Generic (PLEG): container finished" podID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerID="a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa" exitCode=0
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.652642 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4nzvf"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.652642 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nzvf" event={"ID":"cb93f560-6aa7-4afd-ae7e-2470b3e43440","Type":"ContainerDied","Data":"a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa"}
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.654063 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nzvf" event={"ID":"cb93f560-6aa7-4afd-ae7e-2470b3e43440","Type":"ContainerDied","Data":"24cadc6418fb34803140945b85e6caf209538934fa8a4524f12029d08a1a4885"}
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.654094 4731 scope.go:117] "RemoveContainer" containerID="a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.678174 4731 scope.go:117] "RemoveContainer" containerID="fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.709010 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4nzvf"]
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.718096 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4nzvf"]
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.723769 4731 scope.go:117] "RemoveContainer" containerID="3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.755091 4731 scope.go:117] "RemoveContainer" containerID="a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa"
Dec 03 19:40:16 crc kubenswrapper[4731]: E1203 19:40:16.755810 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa\": container with ID starting with a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa not found: ID does not exist" containerID="a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.755929 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa"} err="failed to get container status \"a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa\": rpc error: code = NotFound desc = could not find container \"a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa\": container with ID starting with a855eb6f544a37ee462e8c53554146b6be8dcc3fb903a696c94cc9b4fb0325aa not found: ID does not exist"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.756034 4731 scope.go:117] "RemoveContainer" containerID="fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5"
Dec 03 19:40:16 crc kubenswrapper[4731]: E1203 19:40:16.756752 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5\": container with ID starting with fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5 not found: ID does not exist" containerID="fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.756788 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5"} err="failed to get container status \"fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5\": rpc error: code = NotFound desc = could not find container \"fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5\": container with ID starting with fda3e4035b1b3cd205fb775cf8dc4134dd912bc4fd68fd1926579fc299e72fb5 not found: ID does not exist"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.756815 4731 scope.go:117] "RemoveContainer" containerID="3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1"
Dec 03 19:40:16 crc kubenswrapper[4731]: E1203 19:40:16.757220 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1\": container with ID starting with 3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1 not found: ID does not exist" containerID="3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1"
Dec 03 19:40:16 crc kubenswrapper[4731]: I1203 19:40:16.757242 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1"} err="failed to get container status \"3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1\": rpc error: code = NotFound desc = could not find container \"3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1\": container with ID starting with 3021d78844a92e7cb1719f0cf443f44841e8a9ae9f3bf7d1eeae823476ad1ee1 not found: ID does not exist"
Dec 03 19:40:17 crc kubenswrapper[4731]: I1203 19:40:17.878171 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" path="/var/lib/kubelet/pods/cb93f560-6aa7-4afd-ae7e-2470b3e43440/volumes"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.310215 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 03 19:40:26 crc kubenswrapper[4731]: E1203 19:40:26.311274 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerName="registry-server"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.311294 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerName="registry-server"
Dec 03 19:40:26 crc kubenswrapper[4731]: E1203 19:40:26.311333 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerName="extract-utilities"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.311341 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerName="extract-utilities"
Dec 03 19:40:26 crc kubenswrapper[4731]: E1203 19:40:26.311402 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerName="extract-content"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.311410 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerName="extract-content"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.311622 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb93f560-6aa7-4afd-ae7e-2470b3e43440" containerName="registry-server"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.312450 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.314547 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.314570 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ldvz9"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.314775 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.315425 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.339629 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.399553 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-config-data\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.399609 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.399880 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.501707 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.501769 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.501802 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.501832 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.501903 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmr7w\" (UniqueName: \"kubernetes.io/projected/59685196-efa1-464f-b297-b3f23d53e46d-kube-api-access-lmr7w\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.501945 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.501963 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.502041 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-config-data\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.502060 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.503137 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.504413 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-config-data\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.511392 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.604414 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.604738 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.604890 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.605169 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmr7w\" (UniqueName: \"kubernetes.io/projected/59685196-efa1-464f-b297-b3f23d53e46d-kube-api-access-lmr7w\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.605368 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.605401 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.606008 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.606027 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.607213 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.612124 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.613349 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.622632 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmr7w\" (UniqueName: \"kubernetes.io/projected/59685196-efa1-464f-b297-b3f23d53e46d-kube-api-access-lmr7w\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.635559 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " pod="openstack/tempest-tests-tempest"
Dec 03 19:40:26 crc kubenswrapper[4731]: I1203 19:40:26.664380 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 03 19:40:27 crc kubenswrapper[4731]: I1203 19:40:27.128505 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 03 19:40:27 crc kubenswrapper[4731]: I1203 19:40:27.770894 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59685196-efa1-464f-b297-b3f23d53e46d","Type":"ContainerStarted","Data":"de547166820c31074a0d567e5c8b7b7801827ad1b392f8c20228a922f2cd6301"}
Dec 03 19:40:59 crc kubenswrapper[4731]: E1203 19:40:59.795804 4731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Dec 03 19:40:59 crc kubenswrapper[4731]: E1203 19:40:59.796537 4731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathE
xpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lmr7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(59685196-efa1-464f-b297-b3f23d53e46d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 19:40:59 crc kubenswrapper[4731]: E1203 19:40:59.797753 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="59685196-efa1-464f-b297-b3f23d53e46d" Dec 03 19:41:00 crc kubenswrapper[4731]: E1203 19:41:00.140509 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="59685196-efa1-464f-b297-b3f23d53e46d" Dec 03 19:41:15 crc kubenswrapper[4731]: I1203 19:41:15.525988 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 19:41:17 crc kubenswrapper[4731]: I1203 19:41:17.308090 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59685196-efa1-464f-b297-b3f23d53e46d","Type":"ContainerStarted","Data":"a52f1a0f1d3c9358d73beb3af3dc018b7d116d18655e70ee89f79618d7601ee2"} Dec 03 19:41:26 crc kubenswrapper[4731]: I1203 19:41:26.469218 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:41:26 crc kubenswrapper[4731]: I1203 19:41:26.469855 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" 
podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:41:56 crc kubenswrapper[4731]: I1203 19:41:56.469225 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:41:56 crc kubenswrapper[4731]: I1203 19:41:56.469823 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.679605 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=57.294203704 podStartE2EDuration="1m45.679581088s" podCreationTimestamp="2025-12-03 19:40:25 +0000 UTC" firstStartedPulling="2025-12-03 19:40:27.137117693 +0000 UTC m=+2747.735712157" lastFinishedPulling="2025-12-03 19:41:15.522495077 +0000 UTC m=+2796.121089541" observedRunningTime="2025-12-03 19:41:17.334818652 +0000 UTC m=+2797.933413126" watchObservedRunningTime="2025-12-03 19:42:10.679581088 +0000 UTC m=+2851.278175552" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.686305 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rf47l"] Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.688431 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.708868 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rf47l"] Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.769804 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pz5\" (UniqueName: \"kubernetes.io/projected/00d37e71-8754-4929-89b5-66d28766cff4-kube-api-access-h6pz5\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.769863 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-catalog-content\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.770288 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-utilities\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.872492 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6pz5\" (UniqueName: \"kubernetes.io/projected/00d37e71-8754-4929-89b5-66d28766cff4-kube-api-access-h6pz5\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.872584 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-catalog-content\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.872712 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-utilities\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.873070 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-catalog-content\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.873421 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-utilities\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:10 crc kubenswrapper[4731]: I1203 19:42:10.896005 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pz5\" (UniqueName: \"kubernetes.io/projected/00d37e71-8754-4929-89b5-66d28766cff4-kube-api-access-h6pz5\") pod \"redhat-operators-rf47l\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:11 crc kubenswrapper[4731]: I1203 19:42:11.012677 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:11 crc kubenswrapper[4731]: I1203 19:42:11.490181 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rf47l"] Dec 03 19:42:11 crc kubenswrapper[4731]: I1203 19:42:11.822648 4731 generic.go:334] "Generic (PLEG): container finished" podID="00d37e71-8754-4929-89b5-66d28766cff4" containerID="47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44" exitCode=0 Dec 03 19:42:11 crc kubenswrapper[4731]: I1203 19:42:11.822942 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf47l" event={"ID":"00d37e71-8754-4929-89b5-66d28766cff4","Type":"ContainerDied","Data":"47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44"} Dec 03 19:42:11 crc kubenswrapper[4731]: I1203 19:42:11.823022 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf47l" event={"ID":"00d37e71-8754-4929-89b5-66d28766cff4","Type":"ContainerStarted","Data":"9cf2dc114aed977e416e2cf05fba761406bb5e192a72eaf9f1d3f0b3e2b16de0"} Dec 03 19:42:12 crc kubenswrapper[4731]: I1203 19:42:12.835787 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf47l" event={"ID":"00d37e71-8754-4929-89b5-66d28766cff4","Type":"ContainerStarted","Data":"ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143"} Dec 03 19:42:14 crc kubenswrapper[4731]: I1203 19:42:14.876444 4731 generic.go:334] "Generic (PLEG): container finished" podID="00d37e71-8754-4929-89b5-66d28766cff4" containerID="ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143" exitCode=0 Dec 03 19:42:14 crc kubenswrapper[4731]: I1203 19:42:14.876676 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf47l" 
event={"ID":"00d37e71-8754-4929-89b5-66d28766cff4","Type":"ContainerDied","Data":"ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143"} Dec 03 19:42:16 crc kubenswrapper[4731]: I1203 19:42:16.896759 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf47l" event={"ID":"00d37e71-8754-4929-89b5-66d28766cff4","Type":"ContainerStarted","Data":"2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872"} Dec 03 19:42:16 crc kubenswrapper[4731]: I1203 19:42:16.923068 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rf47l" podStartSLOduration=2.310959775 podStartE2EDuration="6.923047504s" podCreationTimestamp="2025-12-03 19:42:10 +0000 UTC" firstStartedPulling="2025-12-03 19:42:11.824618536 +0000 UTC m=+2852.423213000" lastFinishedPulling="2025-12-03 19:42:16.436706225 +0000 UTC m=+2857.035300729" observedRunningTime="2025-12-03 19:42:16.915844751 +0000 UTC m=+2857.514439225" watchObservedRunningTime="2025-12-03 19:42:16.923047504 +0000 UTC m=+2857.521641968" Dec 03 19:42:21 crc kubenswrapper[4731]: I1203 19:42:21.012847 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:21 crc kubenswrapper[4731]: I1203 19:42:21.014820 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:22 crc kubenswrapper[4731]: I1203 19:42:22.072226 4731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rf47l" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="registry-server" probeResult="failure" output=< Dec 03 19:42:22 crc kubenswrapper[4731]: timeout: failed to connect service ":50051" within 1s Dec 03 19:42:22 crc kubenswrapper[4731]: > Dec 03 19:42:26 crc kubenswrapper[4731]: I1203 19:42:26.468908 4731 patch_prober.go:28] interesting 
pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:42:26 crc kubenswrapper[4731]: I1203 19:42:26.469478 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:42:26 crc kubenswrapper[4731]: I1203 19:42:26.469534 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:42:26 crc kubenswrapper[4731]: I1203 19:42:26.470359 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:42:26 crc kubenswrapper[4731]: I1203 19:42:26.470410 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" gracePeriod=600 Dec 03 19:42:26 crc kubenswrapper[4731]: E1203 19:42:26.590602 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:42:27 crc kubenswrapper[4731]: I1203 19:42:27.031080 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" exitCode=0 Dec 03 19:42:27 crc kubenswrapper[4731]: I1203 19:42:27.031473 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48"} Dec 03 19:42:27 crc kubenswrapper[4731]: I1203 19:42:27.031530 4731 scope.go:117] "RemoveContainer" containerID="d1f6ed0bc9ec4bc4c31197505f1ff4ce62ba339b0c8f0b0c888f1a412002cab3" Dec 03 19:42:27 crc kubenswrapper[4731]: I1203 19:42:27.032403 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:42:27 crc kubenswrapper[4731]: E1203 19:42:27.032764 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:42:31 crc kubenswrapper[4731]: I1203 19:42:31.068044 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:31 crc kubenswrapper[4731]: I1203 19:42:31.156730 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:31 crc kubenswrapper[4731]: I1203 19:42:31.317102 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rf47l"] Dec 03 19:42:32 crc kubenswrapper[4731]: I1203 19:42:32.127064 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rf47l" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="registry-server" containerID="cri-o://2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872" gracePeriod=2 Dec 03 19:42:32 crc kubenswrapper[4731]: I1203 19:42:32.843648 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:32 crc kubenswrapper[4731]: I1203 19:42:32.969992 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-utilities\") pod \"00d37e71-8754-4929-89b5-66d28766cff4\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " Dec 03 19:42:32 crc kubenswrapper[4731]: I1203 19:42:32.970454 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-catalog-content\") pod \"00d37e71-8754-4929-89b5-66d28766cff4\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " Dec 03 19:42:32 crc kubenswrapper[4731]: I1203 19:42:32.970534 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6pz5\" (UniqueName: \"kubernetes.io/projected/00d37e71-8754-4929-89b5-66d28766cff4-kube-api-access-h6pz5\") pod \"00d37e71-8754-4929-89b5-66d28766cff4\" (UID: \"00d37e71-8754-4929-89b5-66d28766cff4\") " Dec 03 19:42:32 crc kubenswrapper[4731]: I1203 19:42:32.971350 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-utilities" (OuterVolumeSpecName: "utilities") pod "00d37e71-8754-4929-89b5-66d28766cff4" (UID: "00d37e71-8754-4929-89b5-66d28766cff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:42:32 crc kubenswrapper[4731]: I1203 19:42:32.972289 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:42:32 crc kubenswrapper[4731]: I1203 19:42:32.982802 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d37e71-8754-4929-89b5-66d28766cff4-kube-api-access-h6pz5" (OuterVolumeSpecName: "kube-api-access-h6pz5") pod "00d37e71-8754-4929-89b5-66d28766cff4" (UID: "00d37e71-8754-4929-89b5-66d28766cff4"). InnerVolumeSpecName "kube-api-access-h6pz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.074633 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6pz5\" (UniqueName: \"kubernetes.io/projected/00d37e71-8754-4929-89b5-66d28766cff4-kube-api-access-h6pz5\") on node \"crc\" DevicePath \"\"" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.093298 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00d37e71-8754-4929-89b5-66d28766cff4" (UID: "00d37e71-8754-4929-89b5-66d28766cff4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.141226 4731 generic.go:334] "Generic (PLEG): container finished" podID="00d37e71-8754-4929-89b5-66d28766cff4" containerID="2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872" exitCode=0 Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.141343 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf47l" event={"ID":"00d37e71-8754-4929-89b5-66d28766cff4","Type":"ContainerDied","Data":"2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872"} Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.141460 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rf47l" event={"ID":"00d37e71-8754-4929-89b5-66d28766cff4","Type":"ContainerDied","Data":"9cf2dc114aed977e416e2cf05fba761406bb5e192a72eaf9f1d3f0b3e2b16de0"} Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.141529 4731 scope.go:117] "RemoveContainer" containerID="2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.142381 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rf47l" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.176703 4731 scope.go:117] "RemoveContainer" containerID="ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.178461 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d37e71-8754-4929-89b5-66d28766cff4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.219540 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rf47l"] Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.229768 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rf47l"] Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.234939 4731 scope.go:117] "RemoveContainer" containerID="47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.268580 4731 scope.go:117] "RemoveContainer" containerID="2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872" Dec 03 19:42:33 crc kubenswrapper[4731]: E1203 19:42:33.269316 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872\": container with ID starting with 2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872 not found: ID does not exist" containerID="2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.269372 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872"} err="failed to get container status 
\"2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872\": rpc error: code = NotFound desc = could not find container \"2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872\": container with ID starting with 2d4efa455344794890d41e1c07212b6e7c2e8bf8fd5dc9b9e7daff1a27d90872 not found: ID does not exist" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.269407 4731 scope.go:117] "RemoveContainer" containerID="ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143" Dec 03 19:42:33 crc kubenswrapper[4731]: E1203 19:42:33.270049 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143\": container with ID starting with ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143 not found: ID does not exist" containerID="ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.270106 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143"} err="failed to get container status \"ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143\": rpc error: code = NotFound desc = could not find container \"ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143\": container with ID starting with ad1a60c74553c444d6b2455570e31f05bc7c84a371fd40d61d62cea301491143 not found: ID does not exist" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.270140 4731 scope.go:117] "RemoveContainer" containerID="47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44" Dec 03 19:42:33 crc kubenswrapper[4731]: E1203 19:42:33.270592 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44\": container with ID starting with 47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44 not found: ID does not exist" containerID="47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.270631 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44"} err="failed to get container status \"47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44\": rpc error: code = NotFound desc = could not find container \"47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44\": container with ID starting with 47124544a4a155ef54d0fdded6b998cc8c1d86cf3c9463df7aa34695ad550c44 not found: ID does not exist" Dec 03 19:42:33 crc kubenswrapper[4731]: I1203 19:42:33.869626 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d37e71-8754-4929-89b5-66d28766cff4" path="/var/lib/kubelet/pods/00d37e71-8754-4929-89b5-66d28766cff4/volumes" Dec 03 19:42:41 crc kubenswrapper[4731]: I1203 19:42:41.856878 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:42:41 crc kubenswrapper[4731]: E1203 19:42:41.857963 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:42:56 crc kubenswrapper[4731]: I1203 19:42:56.856031 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:42:56 crc 
kubenswrapper[4731]: E1203 19:42:56.857014 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:43:11 crc kubenswrapper[4731]: I1203 19:43:11.857085 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:43:11 crc kubenswrapper[4731]: E1203 19:43:11.858108 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:43:25 crc kubenswrapper[4731]: I1203 19:43:25.856225 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:43:25 crc kubenswrapper[4731]: E1203 19:43:25.856937 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:43:38 crc kubenswrapper[4731]: I1203 19:43:38.856706 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 
03 19:43:38 crc kubenswrapper[4731]: E1203 19:43:38.857486 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:43:49 crc kubenswrapper[4731]: I1203 19:43:49.864486 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:43:49 crc kubenswrapper[4731]: E1203 19:43:49.865243 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:44:04 crc kubenswrapper[4731]: I1203 19:44:04.857034 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:44:04 crc kubenswrapper[4731]: E1203 19:44:04.858046 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:44:18 crc kubenswrapper[4731]: I1203 19:44:18.856684 4731 scope.go:117] "RemoveContainer" 
containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:44:18 crc kubenswrapper[4731]: E1203 19:44:18.857689 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:44:29 crc kubenswrapper[4731]: I1203 19:44:29.862977 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:44:29 crc kubenswrapper[4731]: E1203 19:44:29.863913 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:44:42 crc kubenswrapper[4731]: I1203 19:44:42.855962 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:44:42 crc kubenswrapper[4731]: E1203 19:44:42.856736 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:44:57 crc kubenswrapper[4731]: I1203 19:44:57.856574 4731 scope.go:117] 
"RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:44:57 crc kubenswrapper[4731]: E1203 19:44:57.857422 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.148578 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm"] Dec 03 19:45:00 crc kubenswrapper[4731]: E1203 19:45:00.149584 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="registry-server" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.149599 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="registry-server" Dec 03 19:45:00 crc kubenswrapper[4731]: E1203 19:45:00.149612 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="extract-utilities" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.149619 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="extract-utilities" Dec 03 19:45:00 crc kubenswrapper[4731]: E1203 19:45:00.149631 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="extract-content" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.149636 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="extract-content" Dec 03 19:45:00 crc 
kubenswrapper[4731]: I1203 19:45:00.149832 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d37e71-8754-4929-89b5-66d28766cff4" containerName="registry-server" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.150591 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.154683 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.155050 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.164672 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm"] Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.301004 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f09bf4a-d60f-4d4a-b73f-f143d576625d-secret-volume\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.301466 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxjb9\" (UniqueName: \"kubernetes.io/projected/7f09bf4a-d60f-4d4a-b73f-f143d576625d-kube-api-access-cxjb9\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.301549 4731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f09bf4a-d60f-4d4a-b73f-f143d576625d-config-volume\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.403133 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjb9\" (UniqueName: \"kubernetes.io/projected/7f09bf4a-d60f-4d4a-b73f-f143d576625d-kube-api-access-cxjb9\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.403181 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f09bf4a-d60f-4d4a-b73f-f143d576625d-config-volume\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.403283 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f09bf4a-d60f-4d4a-b73f-f143d576625d-secret-volume\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.404247 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f09bf4a-d60f-4d4a-b73f-f143d576625d-config-volume\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.411893 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f09bf4a-d60f-4d4a-b73f-f143d576625d-secret-volume\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.420406 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxjb9\" (UniqueName: \"kubernetes.io/projected/7f09bf4a-d60f-4d4a-b73f-f143d576625d-kube-api-access-cxjb9\") pod \"collect-profiles-29413185-6fkzm\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.480419 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:00 crc kubenswrapper[4731]: I1203 19:45:00.750650 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm"] Dec 03 19:45:01 crc kubenswrapper[4731]: I1203 19:45:01.661177 4731 generic.go:334] "Generic (PLEG): container finished" podID="7f09bf4a-d60f-4d4a-b73f-f143d576625d" containerID="4acd82e9718aa11750f07c2bc30db817bc5e00cf47d5553ed9918fc1b1723663" exitCode=0 Dec 03 19:45:01 crc kubenswrapper[4731]: I1203 19:45:01.661246 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" event={"ID":"7f09bf4a-d60f-4d4a-b73f-f143d576625d","Type":"ContainerDied","Data":"4acd82e9718aa11750f07c2bc30db817bc5e00cf47d5553ed9918fc1b1723663"} Dec 03 19:45:01 crc kubenswrapper[4731]: I1203 19:45:01.661532 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" event={"ID":"7f09bf4a-d60f-4d4a-b73f-f143d576625d","Type":"ContainerStarted","Data":"d1c271f9c4691e7c7231823917ee334b21a540c6a5d131634b37276502417de9"} Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.059861 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.157436 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxjb9\" (UniqueName: \"kubernetes.io/projected/7f09bf4a-d60f-4d4a-b73f-f143d576625d-kube-api-access-cxjb9\") pod \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.157684 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f09bf4a-d60f-4d4a-b73f-f143d576625d-secret-volume\") pod \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.157747 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f09bf4a-d60f-4d4a-b73f-f143d576625d-config-volume\") pod \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\" (UID: \"7f09bf4a-d60f-4d4a-b73f-f143d576625d\") " Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.159038 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f09bf4a-d60f-4d4a-b73f-f143d576625d-config-volume" (OuterVolumeSpecName: "config-volume") pod "7f09bf4a-d60f-4d4a-b73f-f143d576625d" (UID: "7f09bf4a-d60f-4d4a-b73f-f143d576625d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.159573 4731 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f09bf4a-d60f-4d4a-b73f-f143d576625d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.165379 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f09bf4a-d60f-4d4a-b73f-f143d576625d-kube-api-access-cxjb9" (OuterVolumeSpecName: "kube-api-access-cxjb9") pod "7f09bf4a-d60f-4d4a-b73f-f143d576625d" (UID: "7f09bf4a-d60f-4d4a-b73f-f143d576625d"). InnerVolumeSpecName "kube-api-access-cxjb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.165552 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f09bf4a-d60f-4d4a-b73f-f143d576625d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7f09bf4a-d60f-4d4a-b73f-f143d576625d" (UID: "7f09bf4a-d60f-4d4a-b73f-f143d576625d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.262143 4731 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7f09bf4a-d60f-4d4a-b73f-f143d576625d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.262200 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxjb9\" (UniqueName: \"kubernetes.io/projected/7f09bf4a-d60f-4d4a-b73f-f143d576625d-kube-api-access-cxjb9\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.681060 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" event={"ID":"7f09bf4a-d60f-4d4a-b73f-f143d576625d","Type":"ContainerDied","Data":"d1c271f9c4691e7c7231823917ee334b21a540c6a5d131634b37276502417de9"} Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.681111 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1c271f9c4691e7c7231823917ee334b21a540c6a5d131634b37276502417de9" Dec 03 19:45:03 crc kubenswrapper[4731]: I1203 19:45:03.681198 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413185-6fkzm" Dec 03 19:45:04 crc kubenswrapper[4731]: I1203 19:45:04.144004 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5"] Dec 03 19:45:04 crc kubenswrapper[4731]: I1203 19:45:04.153301 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413140-ln7m5"] Dec 03 19:45:05 crc kubenswrapper[4731]: I1203 19:45:05.868757 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76fea8e6-1b0c-4040-b767-fc7f6205d4ab" path="/var/lib/kubelet/pods/76fea8e6-1b0c-4040-b767-fc7f6205d4ab/volumes" Dec 03 19:45:10 crc kubenswrapper[4731]: I1203 19:45:10.856490 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:45:10 crc kubenswrapper[4731]: E1203 19:45:10.857460 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:45:13 crc kubenswrapper[4731]: I1203 19:45:13.770590 4731 generic.go:334] "Generic (PLEG): container finished" podID="59685196-efa1-464f-b297-b3f23d53e46d" containerID="a52f1a0f1d3c9358d73beb3af3dc018b7d116d18655e70ee89f79618d7601ee2" exitCode=1 Dec 03 19:45:13 crc kubenswrapper[4731]: I1203 19:45:13.770691 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59685196-efa1-464f-b297-b3f23d53e46d","Type":"ContainerDied","Data":"a52f1a0f1d3c9358d73beb3af3dc018b7d116d18655e70ee89f79618d7601ee2"} Dec 03 19:45:15 crc 
kubenswrapper[4731]: I1203 19:45:15.214136 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.314685 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmr7w\" (UniqueName: \"kubernetes.io/projected/59685196-efa1-464f-b297-b3f23d53e46d-kube-api-access-lmr7w\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.314751 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ssh-key\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.314776 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config-secret\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.314866 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-temporary\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.314894 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ca-certs\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc 
kubenswrapper[4731]: I1203 19:45:15.314950 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-config-data\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.314972 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.315550 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.315804 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-config-data" (OuterVolumeSpecName: "config-data") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.315877 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-workdir\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.315999 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"59685196-efa1-464f-b297-b3f23d53e46d\" (UID: \"59685196-efa1-464f-b297-b3f23d53e46d\") " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.318778 4731 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.318816 4731 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.323534 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59685196-efa1-464f-b297-b3f23d53e46d-kube-api-access-lmr7w" (OuterVolumeSpecName: "kube-api-access-lmr7w") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "kube-api-access-lmr7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.337553 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.339532 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.351181 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.367181 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.367510 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.388138 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "59685196-efa1-464f-b297-b3f23d53e46d" (UID: "59685196-efa1-464f-b297-b3f23d53e46d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.427106 4731 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59685196-efa1-464f-b297-b3f23d53e46d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.427534 4731 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.427663 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmr7w\" (UniqueName: \"kubernetes.io/projected/59685196-efa1-464f-b297-b3f23d53e46d-kube-api-access-lmr7w\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.427751 4731 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ssh-key\") on node \"crc\" DevicePath 
\"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.427838 4731 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.427892 4731 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59685196-efa1-464f-b297-b3f23d53e46d-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.428002 4731 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59685196-efa1-464f-b297-b3f23d53e46d-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.458359 4731 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.530007 4731 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.788813 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59685196-efa1-464f-b297-b3f23d53e46d","Type":"ContainerDied","Data":"de547166820c31074a0d567e5c8b7b7801827ad1b392f8c20228a922f2cd6301"} Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.788861 4731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de547166820c31074a0d567e5c8b7b7801827ad1b392f8c20228a922f2cd6301" Dec 03 19:45:15 crc kubenswrapper[4731]: I1203 19:45:15.788865 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.237736 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 19:45:19 crc kubenswrapper[4731]: E1203 19:45:19.238646 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59685196-efa1-464f-b297-b3f23d53e46d" containerName="tempest-tests-tempest-tests-runner" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.238690 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="59685196-efa1-464f-b297-b3f23d53e46d" containerName="tempest-tests-tempest-tests-runner" Dec 03 19:45:19 crc kubenswrapper[4731]: E1203 19:45:19.238715 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f09bf4a-d60f-4d4a-b73f-f143d576625d" containerName="collect-profiles" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.238722 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f09bf4a-d60f-4d4a-b73f-f143d576625d" containerName="collect-profiles" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.239170 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f09bf4a-d60f-4d4a-b73f-f143d576625d" containerName="collect-profiles" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.239203 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="59685196-efa1-464f-b297-b3f23d53e46d" containerName="tempest-tests-tempest-tests-runner" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.239905 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.243178 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ldvz9" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.245746 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.403429 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p86v\" (UniqueName: \"kubernetes.io/projected/1510b7d5-dadc-4cf0-9f68-cd20534973fa-kube-api-access-7p86v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1510b7d5-dadc-4cf0-9f68-cd20534973fa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.404162 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1510b7d5-dadc-4cf0-9f68-cd20534973fa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.505787 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p86v\" (UniqueName: \"kubernetes.io/projected/1510b7d5-dadc-4cf0-9f68-cd20534973fa-kube-api-access-7p86v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1510b7d5-dadc-4cf0-9f68-cd20534973fa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.505857 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1510b7d5-dadc-4cf0-9f68-cd20534973fa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.506501 4731 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1510b7d5-dadc-4cf0-9f68-cd20534973fa\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.529829 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p86v\" (UniqueName: \"kubernetes.io/projected/1510b7d5-dadc-4cf0-9f68-cd20534973fa-kube-api-access-7p86v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1510b7d5-dadc-4cf0-9f68-cd20534973fa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.535964 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1510b7d5-dadc-4cf0-9f68-cd20534973fa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:19 crc kubenswrapper[4731]: I1203 19:45:19.574227 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 19:45:20 crc kubenswrapper[4731]: I1203 19:45:20.194385 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 19:45:20 crc kubenswrapper[4731]: I1203 19:45:20.201727 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:45:20 crc kubenswrapper[4731]: I1203 19:45:20.808364 4731 scope.go:117] "RemoveContainer" containerID="eeebf7b4a970e04e0250d523e60ace8de9dbd7afcdb2437bf4a2f045e760256d" Dec 03 19:45:20 crc kubenswrapper[4731]: I1203 19:45:20.872274 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1510b7d5-dadc-4cf0-9f68-cd20534973fa","Type":"ContainerStarted","Data":"524cb88a54b6fd5791956e0278c46ffafaf6aa09794496fa25784ff3cfede1e5"} Dec 03 19:45:21 crc kubenswrapper[4731]: I1203 19:45:21.883757 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1510b7d5-dadc-4cf0-9f68-cd20534973fa","Type":"ContainerStarted","Data":"87710312861d5db2296acbb344469d1c75a89420f17626a504fac4ff2fa8b9b7"} Dec 03 19:45:21 crc kubenswrapper[4731]: I1203 19:45:21.904824 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.741766857 podStartE2EDuration="2.904800642s" podCreationTimestamp="2025-12-03 19:45:19 +0000 UTC" firstStartedPulling="2025-12-03 19:45:20.20151353 +0000 UTC m=+3040.800107994" lastFinishedPulling="2025-12-03 19:45:21.364547295 +0000 UTC m=+3041.963141779" observedRunningTime="2025-12-03 19:45:21.900749227 +0000 UTC m=+3042.499343691" watchObservedRunningTime="2025-12-03 19:45:21.904800642 +0000 UTC m=+3042.503395116" Dec 03 19:45:23 crc kubenswrapper[4731]: I1203 
19:45:23.856178 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:45:23 crc kubenswrapper[4731]: E1203 19:45:23.856801 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:45:35 crc kubenswrapper[4731]: I1203 19:45:35.856960 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:45:35 crc kubenswrapper[4731]: E1203 19:45:35.858490 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:45:48 crc kubenswrapper[4731]: I1203 19:45:48.856746 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:45:48 crc kubenswrapper[4731]: E1203 19:45:48.857631 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:46:02 crc 
kubenswrapper[4731]: I1203 19:46:02.855898 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:46:02 crc kubenswrapper[4731]: E1203 19:46:02.856598 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.470327 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k4sxz/must-gather-sfszb"] Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.472718 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.478606 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k4sxz"/"kube-root-ca.crt" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.478661 4731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k4sxz"/"openshift-service-ca.crt" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.478815 4731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-k4sxz"/"default-dockercfg-2xv55" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.492321 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k4sxz/must-gather-sfszb"] Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.624539 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/e9c78747-48e2-4e90-a7ad-4c624da161ad-must-gather-output\") pod \"must-gather-sfszb\" (UID: \"e9c78747-48e2-4e90-a7ad-4c624da161ad\") " pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.624998 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk2jm\" (UniqueName: \"kubernetes.io/projected/e9c78747-48e2-4e90-a7ad-4c624da161ad-kube-api-access-hk2jm\") pod \"must-gather-sfszb\" (UID: \"e9c78747-48e2-4e90-a7ad-4c624da161ad\") " pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.727644 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9c78747-48e2-4e90-a7ad-4c624da161ad-must-gather-output\") pod \"must-gather-sfszb\" (UID: \"e9c78747-48e2-4e90-a7ad-4c624da161ad\") " pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.727720 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk2jm\" (UniqueName: \"kubernetes.io/projected/e9c78747-48e2-4e90-a7ad-4c624da161ad-kube-api-access-hk2jm\") pod \"must-gather-sfszb\" (UID: \"e9c78747-48e2-4e90-a7ad-4c624da161ad\") " pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.728676 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9c78747-48e2-4e90-a7ad-4c624da161ad-must-gather-output\") pod \"must-gather-sfszb\" (UID: \"e9c78747-48e2-4e90-a7ad-4c624da161ad\") " pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.749177 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk2jm\" (UniqueName: 
\"kubernetes.io/projected/e9c78747-48e2-4e90-a7ad-4c624da161ad-kube-api-access-hk2jm\") pod \"must-gather-sfszb\" (UID: \"e9c78747-48e2-4e90-a7ad-4c624da161ad\") " pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:46:09 crc kubenswrapper[4731]: I1203 19:46:09.798712 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:46:10 crc kubenswrapper[4731]: I1203 19:46:10.371819 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k4sxz/must-gather-sfszb"] Dec 03 19:46:10 crc kubenswrapper[4731]: I1203 19:46:10.416545 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4sxz/must-gather-sfszb" event={"ID":"e9c78747-48e2-4e90-a7ad-4c624da161ad","Type":"ContainerStarted","Data":"460345aa3a4c655fc7315424f42f1bdcf091ac251159e1f500837ac81874d8c7"} Dec 03 19:46:14 crc kubenswrapper[4731]: I1203 19:46:14.456237 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4sxz/must-gather-sfszb" event={"ID":"e9c78747-48e2-4e90-a7ad-4c624da161ad","Type":"ContainerStarted","Data":"01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d"} Dec 03 19:46:14 crc kubenswrapper[4731]: I1203 19:46:14.856429 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:46:14 crc kubenswrapper[4731]: E1203 19:46:14.856715 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:46:15 crc kubenswrapper[4731]: I1203 19:46:15.467241 4731 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-k4sxz/must-gather-sfszb" event={"ID":"e9c78747-48e2-4e90-a7ad-4c624da161ad","Type":"ContainerStarted","Data":"12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f"} Dec 03 19:46:15 crc kubenswrapper[4731]: I1203 19:46:15.485729 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k4sxz/must-gather-sfszb" podStartSLOduration=2.837580771 podStartE2EDuration="6.485709627s" podCreationTimestamp="2025-12-03 19:46:09 +0000 UTC" firstStartedPulling="2025-12-03 19:46:10.374019699 +0000 UTC m=+3090.972614163" lastFinishedPulling="2025-12-03 19:46:14.022148555 +0000 UTC m=+3094.620743019" observedRunningTime="2025-12-03 19:46:15.484874461 +0000 UTC m=+3096.083468925" watchObservedRunningTime="2025-12-03 19:46:15.485709627 +0000 UTC m=+3096.084304091" Dec 03 19:46:17 crc kubenswrapper[4731]: I1203 19:46:17.986294 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k4sxz/crc-debug-8pkkk"] Dec 03 19:46:17 crc kubenswrapper[4731]: I1203 19:46:17.992716 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:18 crc kubenswrapper[4731]: I1203 19:46:18.019400 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mplmb\" (UniqueName: \"kubernetes.io/projected/ca816e7b-e0ec-4565-8362-ba516754a29b-kube-api-access-mplmb\") pod \"crc-debug-8pkkk\" (UID: \"ca816e7b-e0ec-4565-8362-ba516754a29b\") " pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:18 crc kubenswrapper[4731]: I1203 19:46:18.019794 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca816e7b-e0ec-4565-8362-ba516754a29b-host\") pod \"crc-debug-8pkkk\" (UID: \"ca816e7b-e0ec-4565-8362-ba516754a29b\") " pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:18 crc kubenswrapper[4731]: I1203 19:46:18.121716 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mplmb\" (UniqueName: \"kubernetes.io/projected/ca816e7b-e0ec-4565-8362-ba516754a29b-kube-api-access-mplmb\") pod \"crc-debug-8pkkk\" (UID: \"ca816e7b-e0ec-4565-8362-ba516754a29b\") " pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:18 crc kubenswrapper[4731]: I1203 19:46:18.121761 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca816e7b-e0ec-4565-8362-ba516754a29b-host\") pod \"crc-debug-8pkkk\" (UID: \"ca816e7b-e0ec-4565-8362-ba516754a29b\") " pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:18 crc kubenswrapper[4731]: I1203 19:46:18.121970 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca816e7b-e0ec-4565-8362-ba516754a29b-host\") pod \"crc-debug-8pkkk\" (UID: \"ca816e7b-e0ec-4565-8362-ba516754a29b\") " pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:18 crc 
kubenswrapper[4731]: I1203 19:46:18.144810 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mplmb\" (UniqueName: \"kubernetes.io/projected/ca816e7b-e0ec-4565-8362-ba516754a29b-kube-api-access-mplmb\") pod \"crc-debug-8pkkk\" (UID: \"ca816e7b-e0ec-4565-8362-ba516754a29b\") " pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:18 crc kubenswrapper[4731]: I1203 19:46:18.335174 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:18 crc kubenswrapper[4731]: I1203 19:46:18.514963 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" event={"ID":"ca816e7b-e0ec-4565-8362-ba516754a29b","Type":"ContainerStarted","Data":"fcc2ddfd4965814ac0597b2b98e906eb85d0b454331db1329445853df57b0c85"} Dec 03 19:46:28 crc kubenswrapper[4731]: I1203 19:46:28.856407 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:46:28 crc kubenswrapper[4731]: E1203 19:46:28.858296 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:46:31 crc kubenswrapper[4731]: I1203 19:46:31.664171 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" event={"ID":"ca816e7b-e0ec-4565-8362-ba516754a29b","Type":"ContainerStarted","Data":"dd18c329b1fb5272a2eb116d892322a1d5e1f0fd3da20177c5d31d6e6964c12c"} Dec 03 19:46:31 crc kubenswrapper[4731]: I1203 19:46:31.682884 4731 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" podStartSLOduration=2.43592161 podStartE2EDuration="14.682863755s" podCreationTimestamp="2025-12-03 19:46:17 +0000 UTC" firstStartedPulling="2025-12-03 19:46:18.385223363 +0000 UTC m=+3098.983817827" lastFinishedPulling="2025-12-03 19:46:30.632165508 +0000 UTC m=+3111.230759972" observedRunningTime="2025-12-03 19:46:31.681455981 +0000 UTC m=+3112.280050445" watchObservedRunningTime="2025-12-03 19:46:31.682863755 +0000 UTC m=+3112.281458229" Dec 03 19:46:41 crc kubenswrapper[4731]: I1203 19:46:41.857175 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:46:41 crc kubenswrapper[4731]: E1203 19:46:41.858108 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" Dec 03 19:46:47 crc kubenswrapper[4731]: I1203 19:46:47.822622 4731 generic.go:334] "Generic (PLEG): container finished" podID="ca816e7b-e0ec-4565-8362-ba516754a29b" containerID="dd18c329b1fb5272a2eb116d892322a1d5e1f0fd3da20177c5d31d6e6964c12c" exitCode=0 Dec 03 19:46:47 crc kubenswrapper[4731]: I1203 19:46:47.822685 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" event={"ID":"ca816e7b-e0ec-4565-8362-ba516754a29b","Type":"ContainerDied","Data":"dd18c329b1fb5272a2eb116d892322a1d5e1f0fd3da20177c5d31d6e6964c12c"} Dec 03 19:46:48 crc kubenswrapper[4731]: I1203 19:46:48.957927 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:48 crc kubenswrapper[4731]: I1203 19:46:48.993311 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k4sxz/crc-debug-8pkkk"] Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.001729 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k4sxz/crc-debug-8pkkk"] Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.117068 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca816e7b-e0ec-4565-8362-ba516754a29b-host\") pod \"ca816e7b-e0ec-4565-8362-ba516754a29b\" (UID: \"ca816e7b-e0ec-4565-8362-ba516754a29b\") " Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.117203 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mplmb\" (UniqueName: \"kubernetes.io/projected/ca816e7b-e0ec-4565-8362-ba516754a29b-kube-api-access-mplmb\") pod \"ca816e7b-e0ec-4565-8362-ba516754a29b\" (UID: \"ca816e7b-e0ec-4565-8362-ba516754a29b\") " Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.117197 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca816e7b-e0ec-4565-8362-ba516754a29b-host" (OuterVolumeSpecName: "host") pod "ca816e7b-e0ec-4565-8362-ba516754a29b" (UID: "ca816e7b-e0ec-4565-8362-ba516754a29b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.118039 4731 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca816e7b-e0ec-4565-8362-ba516754a29b-host\") on node \"crc\" DevicePath \"\"" Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.127830 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca816e7b-e0ec-4565-8362-ba516754a29b-kube-api-access-mplmb" (OuterVolumeSpecName: "kube-api-access-mplmb") pod "ca816e7b-e0ec-4565-8362-ba516754a29b" (UID: "ca816e7b-e0ec-4565-8362-ba516754a29b"). InnerVolumeSpecName "kube-api-access-mplmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.220291 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mplmb\" (UniqueName: \"kubernetes.io/projected/ca816e7b-e0ec-4565-8362-ba516754a29b-kube-api-access-mplmb\") on node \"crc\" DevicePath \"\"" Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.875772 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k4sxz/crc-debug-8pkkk" Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.877055 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca816e7b-e0ec-4565-8362-ba516754a29b" path="/var/lib/kubelet/pods/ca816e7b-e0ec-4565-8362-ba516754a29b/volumes" Dec 03 19:46:49 crc kubenswrapper[4731]: I1203 19:46:49.877863 4731 scope.go:117] "RemoveContainer" containerID="dd18c329b1fb5272a2eb116d892322a1d5e1f0fd3da20177c5d31d6e6964c12c" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.281027 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k4sxz/crc-debug-5z5vx"] Dec 03 19:46:50 crc kubenswrapper[4731]: E1203 19:46:50.281472 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca816e7b-e0ec-4565-8362-ba516754a29b" containerName="container-00" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.281486 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca816e7b-e0ec-4565-8362-ba516754a29b" containerName="container-00" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.281696 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca816e7b-e0ec-4565-8362-ba516754a29b" containerName="container-00" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.282352 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.445192 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfhlh\" (UniqueName: \"kubernetes.io/projected/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-kube-api-access-cfhlh\") pod \"crc-debug-5z5vx\" (UID: \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\") " pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.445440 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-host\") pod \"crc-debug-5z5vx\" (UID: \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\") " pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.547714 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-host\") pod \"crc-debug-5z5vx\" (UID: \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\") " pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.547807 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfhlh\" (UniqueName: \"kubernetes.io/projected/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-kube-api-access-cfhlh\") pod \"crc-debug-5z5vx\" (UID: \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\") " pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.547896 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-host\") pod \"crc-debug-5z5vx\" (UID: \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\") " pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:50 crc 
kubenswrapper[4731]: I1203 19:46:50.565460 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfhlh\" (UniqueName: \"kubernetes.io/projected/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-kube-api-access-cfhlh\") pod \"crc-debug-5z5vx\" (UID: \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\") " pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.598851 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:50 crc kubenswrapper[4731]: I1203 19:46:50.872556 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" event={"ID":"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0","Type":"ContainerStarted","Data":"0f2d30ee81f2e2995176563fcf5404193224e2ff03b8e3e4e0515ffad14b65a0"} Dec 03 19:46:51 crc kubenswrapper[4731]: I1203 19:46:51.899339 4731 generic.go:334] "Generic (PLEG): container finished" podID="c9c900fa-d2f7-4fc2-a307-a790a2cf08f0" containerID="8ca9c530b984a1befb9b5534c1b7b59d7568782975d1b7390091159f004b9b7a" exitCode=1 Dec 03 19:46:51 crc kubenswrapper[4731]: I1203 19:46:51.899712 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" event={"ID":"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0","Type":"ContainerDied","Data":"8ca9c530b984a1befb9b5534c1b7b59d7568782975d1b7390091159f004b9b7a"} Dec 03 19:46:51 crc kubenswrapper[4731]: I1203 19:46:51.960418 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k4sxz/crc-debug-5z5vx"] Dec 03 19:46:51 crc kubenswrapper[4731]: I1203 19:46:51.974750 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k4sxz/crc-debug-5z5vx"] Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.018804 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k4sxz/crc-debug-5z5vx" Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.103557 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfhlh\" (UniqueName: \"kubernetes.io/projected/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-kube-api-access-cfhlh\") pod \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\" (UID: \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\") " Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.103662 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-host\") pod \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\" (UID: \"c9c900fa-d2f7-4fc2-a307-a790a2cf08f0\") " Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.104389 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-host" (OuterVolumeSpecName: "host") pod "c9c900fa-d2f7-4fc2-a307-a790a2cf08f0" (UID: "c9c900fa-d2f7-4fc2-a307-a790a2cf08f0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.117606 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-kube-api-access-cfhlh" (OuterVolumeSpecName: "kube-api-access-cfhlh") pod "c9c900fa-d2f7-4fc2-a307-a790a2cf08f0" (UID: "c9c900fa-d2f7-4fc2-a307-a790a2cf08f0"). InnerVolumeSpecName "kube-api-access-cfhlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.206563 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfhlh\" (UniqueName: \"kubernetes.io/projected/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-kube-api-access-cfhlh\") on node \"crc\" DevicePath \"\"" Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.206609 4731 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0-host\") on node \"crc\" DevicePath \"\"" Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.873715 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c900fa-d2f7-4fc2-a307-a790a2cf08f0" path="/var/lib/kubelet/pods/c9c900fa-d2f7-4fc2-a307-a790a2cf08f0/volumes" Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.919705 4731 scope.go:117] "RemoveContainer" containerID="8ca9c530b984a1befb9b5534c1b7b59d7568782975d1b7390091159f004b9b7a" Dec 03 19:46:53 crc kubenswrapper[4731]: I1203 19:46:53.919789 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k4sxz/crc-debug-5z5vx"
Dec 03 19:46:54 crc kubenswrapper[4731]: I1203 19:46:54.857080 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48"
Dec 03 19:46:54 crc kubenswrapper[4731]: E1203 19:46:54.857721 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:47:08 crc kubenswrapper[4731]: I1203 19:47:08.855736 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48"
Dec 03 19:47:08 crc kubenswrapper[4731]: E1203 19:47:08.856513 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:47:20 crc kubenswrapper[4731]: I1203 19:47:20.856916 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48"
Dec 03 19:47:20 crc kubenswrapper[4731]: E1203 19:47:20.857916 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:47:24 crc kubenswrapper[4731]: I1203 19:47:24.673586 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55489cbdc4-8kvp2_60084d2a-1621-420c-80ac-fb38a0eae005/barbican-api/0.log"
Dec 03 19:47:24 crc kubenswrapper[4731]: I1203 19:47:24.820478 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55489cbdc4-8kvp2_60084d2a-1621-420c-80ac-fb38a0eae005/barbican-api-log/0.log"
Dec 03 19:47:24 crc kubenswrapper[4731]: I1203 19:47:24.938780 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76b8bcd5f6-jzp2r_895114ae-e1ff-4386-8d43-1c7a2a9f2867/barbican-keystone-listener/0.log"
Dec 03 19:47:24 crc kubenswrapper[4731]: I1203 19:47:24.965163 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76b8bcd5f6-jzp2r_895114ae-e1ff-4386-8d43-1c7a2a9f2867/barbican-keystone-listener-log/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.120242 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86cfb76fcc-q9qnm_2757b374-1ad9-404d-89a6-a033996ac07c/barbican-worker/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.157608 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86cfb76fcc-q9qnm_2757b374-1ad9-404d-89a6-a033996ac07c/barbican-worker-log/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.314763 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-mv9jw_079e5870-590d-4617-b9de-acdae5e59284/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.429269 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6948a0b4-f3c6-483a-b304-8c5cefee3e31/ceilometer-central-agent/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.481107 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6948a0b4-f3c6-483a-b304-8c5cefee3e31/ceilometer-notification-agent/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.540056 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6948a0b4-f3c6-483a-b304-8c5cefee3e31/proxy-httpd/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.567248 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6948a0b4-f3c6-483a-b304-8c5cefee3e31/sg-core/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.738189 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_00285379-3e6d-4b1c-9de5-d08bacd73c79/cinder-api/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.778297 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_00285379-3e6d-4b1c-9de5-d08bacd73c79/cinder-api-log/0.log"
Dec 03 19:47:25 crc kubenswrapper[4731]: I1203 19:47:25.900583 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d8d6f074-0653-495e-8e09-b0cbc71e7e0a/cinder-scheduler/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.018936 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d8d6f074-0653-495e-8e09-b0cbc71e7e0a/probe/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.127812 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-gdbjj_d668caaa-0ba4-4cbe-8fce-8154cf9b8b26/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.266222 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-2826p_1bdcc986-86f8-47a5-9856-b3e0969e9d29/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.346763 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f469589c7-hz6wc_2995333d-35da-4bd5-a503-e998d4311219/init/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.518941 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f469589c7-hz6wc_2995333d-35da-4bd5-a503-e998d4311219/init/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.595970 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-d9hgw_e27c61a5-7955-4d79-81a5-6f12322579c0/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.627924 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-f469589c7-hz6wc_2995333d-35da-4bd5-a503-e998d4311219/dnsmasq-dns/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.826381 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_74f83d4f-46da-43e3-9fab-56e0e45dd76d/glance-httpd/0.log"
Dec 03 19:47:26 crc kubenswrapper[4731]: I1203 19:47:26.863546 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_74f83d4f-46da-43e3-9fab-56e0e45dd76d/glance-log/0.log"
Dec 03 19:47:27 crc kubenswrapper[4731]: I1203 19:47:27.034381 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_336bb4e4-deca-445b-af0f-2df6ea097a14/glance-log/0.log"
Dec 03 19:47:27 crc kubenswrapper[4731]: I1203 19:47:27.057127 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_336bb4e4-deca-445b-af0f-2df6ea097a14/glance-httpd/0.log"
Dec 03 19:47:27 crc kubenswrapper[4731]: I1203 19:47:27.251955 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7578458cb-br8st_5a524772-1481-4781-9847-f3394664a2d3/horizon/0.log"
Dec 03 19:47:27 crc kubenswrapper[4731]: I1203 19:47:27.413434 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-v5qt6_500db9f6-205e-4e09-a8a4-f0e1bf42e867/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:27 crc kubenswrapper[4731]: I1203 19:47:27.527539 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7578458cb-br8st_5a524772-1481-4781-9847-f3394664a2d3/horizon-log/0.log"
Dec 03 19:47:27 crc kubenswrapper[4731]: I1203 19:47:27.687345 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-sfjq4_d77a7c18-1300-42ea-8c28-910dcea576ff/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:27 crc kubenswrapper[4731]: I1203 19:47:27.815187 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-787778d6bb-5glsd_87313be0-fc9a-4f6b-a50c-1c2adc167dad/keystone-api/0.log"
Dec 03 19:47:27 crc kubenswrapper[4731]: I1203 19:47:27.941714 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4b88a816-d7c7-4608-b062-f8b9432af359/kube-state-metrics/0.log"
Dec 03 19:47:28 crc kubenswrapper[4731]: I1203 19:47:28.056527 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vg8h5_63e88ef4-d82b-4798-b386-8158184b32d4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:28 crc kubenswrapper[4731]: I1203 19:47:28.352899 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-748fc65857-2r69r_74e9f605-1edd-4f8e-af56-133041b4c068/neutron-httpd/0.log"
Dec 03 19:47:28 crc kubenswrapper[4731]: I1203 19:47:28.389375 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-748fc65857-2r69r_74e9f605-1edd-4f8e-af56-133041b4c068/neutron-api/0.log"
Dec 03 19:47:28 crc kubenswrapper[4731]: I1203 19:47:28.640826 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kv876_62af9712-5ba8-42f1-ba62-6b6de75e0de6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:29 crc kubenswrapper[4731]: I1203 19:47:29.126687 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2abf2726-a215-4e18-a1f8-a05cdda42b69/nova-api-log/0.log"
Dec 03 19:47:29 crc kubenswrapper[4731]: I1203 19:47:29.202691 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2abf2726-a215-4e18-a1f8-a05cdda42b69/nova-api-api/0.log"
Dec 03 19:47:29 crc kubenswrapper[4731]: I1203 19:47:29.271302 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_0080e314-2b1a-4194-b82c-84d5b3d4f1a8/nova-cell0-conductor-conductor/0.log"
Dec 03 19:47:29 crc kubenswrapper[4731]: I1203 19:47:29.491896 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_36c34f8f-b818-4241-a974-316d98a4eaca/nova-cell1-conductor-conductor/0.log"
Dec 03 19:47:29 crc kubenswrapper[4731]: I1203 19:47:29.589868 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_170f66c2-939d-42f6-af1e-f28c9cc92a71/nova-cell1-novncproxy-novncproxy/0.log"
Dec 03 19:47:29 crc kubenswrapper[4731]: I1203 19:47:29.730840 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-bdl5z_dac4ab43-2ebd-4e3b-a87d-f18c4f9147ac/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:29 crc kubenswrapper[4731]: I1203 19:47:29.832133 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c46ad6da-321d-4263-a203-8d9f47b1ab43/nova-metadata-log/0.log"
Dec 03 19:47:30 crc kubenswrapper[4731]: I1203 19:47:30.130076 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fe568c47-889f-4487-a5e1-9cd479fd0145/nova-scheduler-scheduler/0.log"
Dec 03 19:47:30 crc kubenswrapper[4731]: I1203 19:47:30.253733 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d340d800-c6f0-4375-81ed-d993a19950dd/mysql-bootstrap/0.log"
Dec 03 19:47:30 crc kubenswrapper[4731]: I1203 19:47:30.408660 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d340d800-c6f0-4375-81ed-d993a19950dd/galera/0.log"
Dec 03 19:47:30 crc kubenswrapper[4731]: I1203 19:47:30.430249 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d340d800-c6f0-4375-81ed-d993a19950dd/mysql-bootstrap/0.log"
Dec 03 19:47:30 crc kubenswrapper[4731]: I1203 19:47:30.622100 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d74ba4a-c904-441f-871b-57c691c528e2/mysql-bootstrap/0.log"
Dec 03 19:47:30 crc kubenswrapper[4731]: I1203 19:47:30.816326 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d74ba4a-c904-441f-871b-57c691c528e2/mysql-bootstrap/0.log"
Dec 03 19:47:30 crc kubenswrapper[4731]: I1203 19:47:30.827773 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d74ba4a-c904-441f-871b-57c691c528e2/galera/0.log"
Dec 03 19:47:30 crc kubenswrapper[4731]: I1203 19:47:30.867202 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c46ad6da-321d-4263-a203-8d9f47b1ab43/nova-metadata-metadata/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.005913 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6e513fe3-52a0-403d-a6d5-b76e905e55e0/openstackclient/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.110641 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-r76nc_077da0f9-1e75-450b-b6b1-921c8ff9950b/openstack-network-exporter/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.263119 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rrp7_c42cb210-65c5-43db-84de-7ecf24807aab/ovsdb-server-init/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.452527 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rrp7_c42cb210-65c5-43db-84de-7ecf24807aab/ovsdb-server/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.481938 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rrp7_c42cb210-65c5-43db-84de-7ecf24807aab/ovs-vswitchd/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.540564 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rrp7_c42cb210-65c5-43db-84de-7ecf24807aab/ovsdb-server-init/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.659720 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s8zdm_d8f5bdad-d0e4-418c-b6f0-0c1c6f0da2c3/ovn-controller/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.806980 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-6thtx_732d4254-41b9-4098-87cd-223787bf455e/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:31 crc kubenswrapper[4731]: I1203 19:47:31.923226 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_71768265-a5dd-4890-b3ff-349f6a1114fc/openstack-network-exporter/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.043076 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_71768265-a5dd-4890-b3ff-349f6a1114fc/ovn-northd/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.131467 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d2a6487b-2b25-4b00-a4c3-4c11caa0da2b/ovsdbserver-nb/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.180563 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d2a6487b-2b25-4b00-a4c3-4c11caa0da2b/openstack-network-exporter/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.384950 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d88aab08-1249-4391-b0e2-ec8f0704e7c3/openstack-network-exporter/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.464621 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d88aab08-1249-4391-b0e2-ec8f0704e7c3/ovsdbserver-sb/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.532766 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f7d595d8d-wt4nd_9981e783-5ae8-488c-85ec-2b06327f324c/placement-api/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.654771 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f7d595d8d-wt4nd_9981e783-5ae8-488c-85ec-2b06327f324c/placement-log/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.677205 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f5c78742-a693-4329-956e-96662dfcb374/setup-container/0.log"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.856029 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48"
Dec 03 19:47:32 crc kubenswrapper[4731]: I1203 19:47:32.992665 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f5c78742-a693-4329-956e-96662dfcb374/rabbitmq/0.log"
Dec 03 19:47:33 crc kubenswrapper[4731]: I1203 19:47:33.006355 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f5c78742-a693-4329-956e-96662dfcb374/setup-container/0.log"
Dec 03 19:47:33 crc kubenswrapper[4731]: I1203 19:47:33.030358 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2baac998-a8f6-4902-a641-5b9229c9dd2f/setup-container/0.log"
Dec 03 19:47:33 crc kubenswrapper[4731]: I1203 19:47:33.301682 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"7aa9760dbe2905d27052527c087aa5641325fe330e7320b8eb33222275cbfb7a"}
Dec 03 19:47:33 crc kubenswrapper[4731]: I1203 19:47:33.519558 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2baac998-a8f6-4902-a641-5b9229c9dd2f/rabbitmq/0.log"
Dec 03 19:47:33 crc kubenswrapper[4731]: I1203 19:47:33.527717 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2baac998-a8f6-4902-a641-5b9229c9dd2f/setup-container/0.log"
Dec 03 19:47:33 crc kubenswrapper[4731]: I1203 19:47:33.586097 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6pjr6_3c5a2880-9f98-4d8c-ac95-a9b784692c44/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:33 crc kubenswrapper[4731]: I1203 19:47:33.713533 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ptwmj_d910b616-8396-482a-8fd9-976f3b1ac4a0/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:33 crc kubenswrapper[4731]: I1203 19:47:33.823292 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wfbhb_3384f02c-5b8e-4711-9595-0c62bb7fe7d4/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.024171 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rlcd9_65e0409b-7c26-42bc-9543-f82d0d6a1d5d/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.106865 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rsrfq_2ec83323-6f0b-4824-aff6-c68a5b5628cd/ssh-known-hosts-edpm-deployment/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.348176 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5cdc55748f-txbzr_1326b67b-b6eb-476f-957b-3f6f2ba94ec5/proxy-server/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.439383 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5cdc55748f-txbzr_1326b67b-b6eb-476f-957b-3f6f2ba94ec5/proxy-httpd/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.458509 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6vv75_6c1b1914-8cc7-4d1b-9fbb-0cdaf78a5285/swift-ring-rebalance/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.624375 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/account-auditor/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.672555 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/account-reaper/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.726477 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/account-replicator/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.874049 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/container-auditor/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.879414 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/account-server/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.909720 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/container-replicator/0.log"
Dec 03 19:47:34 crc kubenswrapper[4731]: I1203 19:47:34.927299 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/container-server/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.063794 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/container-updater/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.104857 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/object-auditor/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.157227 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/object-expirer/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.225676 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/object-replicator/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.247426 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/object-server/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.322500 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/object-updater/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.423148 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/rsync/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.468403 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8327126f-a2f3-4b2d-a5b3-118bfa1f41ce/swift-recon-cron/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.549476 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wsklj_b0b0f663-169b-4270-be40-1b2dfab89560/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.842619 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1510b7d5-dadc-4cf0-9f68-cd20534973fa/test-operator-logs-container/0.log"
Dec 03 19:47:35 crc kubenswrapper[4731]: I1203 19:47:35.901364 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_59685196-efa1-464f-b297-b3f23d53e46d/tempest-tests-tempest-tests-runner/0.log"
Dec 03 19:47:36 crc kubenswrapper[4731]: I1203 19:47:36.038946 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-69k6k_1728626c-786d-4913-9501-a8286b12f474/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 19:47:47 crc kubenswrapper[4731]: I1203 19:47:47.297425 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_286db73c-ad17-4f3b-aeb8-d8423872a2a1/memcached/0.log"
Dec 03 19:48:02 crc kubenswrapper[4731]: I1203 19:48:02.520103 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz_22d0de6c-28c2-44d2-8aef-c5680cb681aa/util/0.log"
Dec 03 19:48:02 crc kubenswrapper[4731]: I1203 19:48:02.737533 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz_22d0de6c-28c2-44d2-8aef-c5680cb681aa/pull/0.log"
Dec 03 19:48:02 crc kubenswrapper[4731]: I1203 19:48:02.738062 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz_22d0de6c-28c2-44d2-8aef-c5680cb681aa/util/0.log"
Dec 03 19:48:02 crc kubenswrapper[4731]: I1203 19:48:02.791615 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz_22d0de6c-28c2-44d2-8aef-c5680cb681aa/pull/0.log"
Dec 03 19:48:02 crc kubenswrapper[4731]: I1203 19:48:02.985871 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz_22d0de6c-28c2-44d2-8aef-c5680cb681aa/util/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.011699 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz_22d0de6c-28c2-44d2-8aef-c5680cb681aa/pull/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.026542 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a408b4a84e94d16bade86c7ee7070e0e31dd35adc30cce9bea9470f6419vwlz_22d0de6c-28c2-44d2-8aef-c5680cb681aa/extract/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.206476 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rlbmk_f6d78107-9c18-4cca-afa8-a360f45c6bac/kube-rbac-proxy/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.283478 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rlbmk_f6d78107-9c18-4cca-afa8-a360f45c6bac/manager/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.286510 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-2zlnb_362b0743-3165-454e-93b9-b6713d26680b/kube-rbac-proxy/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.464718 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-2zlnb_362b0743-3165-454e-93b9-b6713d26680b/manager/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.497504 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-2vmhb_d8e13eee-1041-4fb9-b8d1-6169c42d5de3/kube-rbac-proxy/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.547038 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-2vmhb_d8e13eee-1041-4fb9-b8d1-6169c42d5de3/manager/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.704882 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-bqv9t_7f50e5ed-f1dc-4706-8a7c-0d7e85a06351/kube-rbac-proxy/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.797128 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-bqv9t_7f50e5ed-f1dc-4706-8a7c-0d7e85a06351/manager/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.929163 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-w54rz_cf89e5f1-3460-42f6-b66f-6a556118cd30/manager/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.943916 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-w54rz_cf89e5f1-3460-42f6-b66f-6a556118cd30/kube-rbac-proxy/0.log"
Dec 03 19:48:03 crc kubenswrapper[4731]: I1203 19:48:03.981846 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-d22cz_1c371c8f-7ede-4286-9d7c-6f65f1323237/kube-rbac-proxy/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.146643 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-d22cz_1c371c8f-7ede-4286-9d7c-6f65f1323237/manager/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.169205 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-777cfc666b-wx49m_8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5/kube-rbac-proxy/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.416952 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-rpprb_b16dc4e0-2027-45dd-bee8-c5c5346e13f5/kube-rbac-proxy/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.445921 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-777cfc666b-wx49m_8bb7d36f-4cb9-4300-9dcc-ce0e6a1eb7f5/manager/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.456074 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-rpprb_b16dc4e0-2027-45dd-bee8-c5c5346e13f5/manager/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.658348 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-rns58_dd42edc8-cfba-4913-8d87-7860ecef904f/kube-rbac-proxy/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.724871 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-rns58_dd42edc8-cfba-4913-8d87-7860ecef904f/manager/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.815766 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-h2ldk_c714457d-536a-48fe-8df4-758cff8fb22d/kube-rbac-proxy/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.859771 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-h2ldk_c714457d-536a-48fe-8df4-758cff8fb22d/manager/0.log"
Dec 03 19:48:04 crc kubenswrapper[4731]: I1203 19:48:04.913784 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ncb9l_3fe917e9-4872-4eb1-9bc4-744c81813123/kube-rbac-proxy/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.042053 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ncb9l_3fe917e9-4872-4eb1-9bc4-744c81813123/manager/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.220110 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-54m7x_6085f8d0-d279-4997-a86a-3e539495c9d0/manager/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.220379 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-54m7x_6085f8d0-d279-4997-a86a-3e539495c9d0/kube-rbac-proxy/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.362611 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-85q9h_37985ade-8410-4ecb-af0c-4d7bdd40608a/kube-rbac-proxy/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.491055 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-ddmb5_2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f/kube-rbac-proxy/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.535113 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-85q9h_37985ade-8410-4ecb-af0c-4d7bdd40608a/manager/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.645674 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-ddmb5_2d3f2bd8-93ab-437c-9bb1-dc6d06ae590f/manager/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.720833 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq_e625ea8d-55cc-4749-80f6-2848e064a6bf/kube-rbac-proxy/0.log"
Dec 03 19:48:05 crc kubenswrapper[4731]: I1203 19:48:05.750959 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4pljbq_e625ea8d-55cc-4749-80f6-2848e064a6bf/manager/0.log"
Dec 03 19:48:06 crc kubenswrapper[4731]: I1203 19:48:06.178397 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5984d69b9f-x9b92_799e7213-a183-43a9-9d26-ff500765cdeb/operator/0.log"
Dec 03 19:48:06 crc kubenswrapper[4731]: I1203 19:48:06.198868 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-x5lwc_b7d47a7b-9fb4-4431-99bc-fac95d81a4cf/registry-server/0.log"
Dec 03 19:48:06 crc kubenswrapper[4731]: I1203 19:48:06.458226 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-z74rk_2efe3c0e-8643-45f4-920e-17aa65157644/kube-rbac-proxy/0.log"
Dec 03 19:48:06 crc kubenswrapper[4731]: I1203 19:48:06.524824 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-z74rk_2efe3c0e-8643-45f4-920e-17aa65157644/manager/0.log"
Dec 03 19:48:06 crc kubenswrapper[4731]: I1203 19:48:06.648495 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-9b2cf_2b4f79c3-66c0-4c91-bfd1-bef243806900/kube-rbac-proxy/0.log"
Dec 03 19:48:06 crc kubenswrapper[4731]: I1203 19:48:06.752079 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-9b2cf_2b4f79c3-66c0-4c91-bfd1-bef243806900/manager/0.log"
Dec 03 19:48:06 crc kubenswrapper[4731]: I1203 19:48:06.928647 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gwz5z_ce8b3502-d3bc-463d-b3a4-a758b6b42acb/operator/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.021095 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-zlg7s_23b6d2e9-27bc-4944-ba47-a592981fa0d3/kube-rbac-proxy/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.132796 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5d8f48999-r8rg8_771782b8-3be4-499f-96b9-5b862de7f654/manager/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.137079 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-zlg7s_23b6d2e9-27bc-4944-ba47-a592981fa0d3/manager/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.262210 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-qsgsq_e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3/kube-rbac-proxy/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.286939 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-qsgsq_e63db4fb-b3e3-45bb-bd2d-d5af9574d8a3/manager/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.345560 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-428v7_699771e7-b57b-4435-95d9-964c90bbcc3f/kube-rbac-proxy/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.432207 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-428v7_699771e7-b57b-4435-95d9-964c90bbcc3f/manager/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.564613 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-gt4xb_36852547-6431-45bc-b56e-6f8261334da2/kube-rbac-proxy/0.log"
Dec 03 19:48:07 crc kubenswrapper[4731]: I1203 19:48:07.669473 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-gt4xb_36852547-6431-45bc-b56e-6f8261334da2/manager/0.log"
Dec 03 19:48:26 crc kubenswrapper[4731]: I1203 19:48:26.308072 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-k7gjd_f0b43134-d5ba-4f4b-bd4b-e5a838d23b18/control-plane-machine-set-operator/0.log"
Dec 03 19:48:26 crc kubenswrapper[4731]: I1203 19:48:26.483897 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vvkjw_a3a43226-6c7b-43bf-a154-093348017ac8/machine-api-operator/0.log"
Dec 03 19:48:26 crc kubenswrapper[4731]: I1203 19:48:26.491018 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vvkjw_a3a43226-6c7b-43bf-a154-093348017ac8/kube-rbac-proxy/0.log"
Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.137906 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zrgcb"]
Dec 03 19:48:32 crc kubenswrapper[4731]: E1203 19:48:32.138858 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c900fa-d2f7-4fc2-a307-a790a2cf08f0" containerName="container-00"
Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.138873 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c900fa-d2f7-4fc2-a307-a790a2cf08f0" containerName="container-00"
Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.139114 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c900fa-d2f7-4fc2-a307-a790a2cf08f0" containerName="container-00"
Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.144734 4731 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.160081 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrgcb"] Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.264796 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8x4c\" (UniqueName: \"kubernetes.io/projected/2ed333e0-9070-45fd-a850-395ab3d8f8ed-kube-api-access-d8x4c\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.264882 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-utilities\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.265432 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-catalog-content\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.335122 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6264h"] Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.339955 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.354602 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6264h"] Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.402279 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-utilities\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.402352 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-catalog-content\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.402400 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-catalog-content\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.402427 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncfpd\" (UniqueName: \"kubernetes.io/projected/ef49f983-af7d-46ea-8196-109683380e89-kube-api-access-ncfpd\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.402465 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d8x4c\" (UniqueName: \"kubernetes.io/projected/2ed333e0-9070-45fd-a850-395ab3d8f8ed-kube-api-access-d8x4c\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.402494 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-utilities\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.402956 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-utilities\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.403189 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-catalog-content\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.445409 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8x4c\" (UniqueName: \"kubernetes.io/projected/2ed333e0-9070-45fd-a850-395ab3d8f8ed-kube-api-access-d8x4c\") pod \"community-operators-zrgcb\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.477836 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.506749 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-catalog-content\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.506805 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncfpd\" (UniqueName: \"kubernetes.io/projected/ef49f983-af7d-46ea-8196-109683380e89-kube-api-access-ncfpd\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.506954 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-utilities\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.507512 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-utilities\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.507756 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-catalog-content\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " 
pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.544233 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncfpd\" (UniqueName: \"kubernetes.io/projected/ef49f983-af7d-46ea-8196-109683380e89-kube-api-access-ncfpd\") pod \"redhat-marketplace-6264h\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:32 crc kubenswrapper[4731]: I1203 19:48:32.736318 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:33 crc kubenswrapper[4731]: I1203 19:48:33.113560 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrgcb"] Dec 03 19:48:33 crc kubenswrapper[4731]: I1203 19:48:33.311039 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6264h"] Dec 03 19:48:33 crc kubenswrapper[4731]: W1203 19:48:33.315367 4731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef49f983_af7d_46ea_8196_109683380e89.slice/crio-2c2ff7f4cd5cde9dc4e30ff2162f8c1ad79f146d0817bac066d2b3f2a19bc3a0 WatchSource:0}: Error finding container 2c2ff7f4cd5cde9dc4e30ff2162f8c1ad79f146d0817bac066d2b3f2a19bc3a0: Status 404 returned error can't find the container with id 2c2ff7f4cd5cde9dc4e30ff2162f8c1ad79f146d0817bac066d2b3f2a19bc3a0 Dec 03 19:48:33 crc kubenswrapper[4731]: I1203 19:48:33.907756 4731 generic.go:334] "Generic (PLEG): container finished" podID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerID="69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036" exitCode=0 Dec 03 19:48:33 crc kubenswrapper[4731]: I1203 19:48:33.907856 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrgcb" 
event={"ID":"2ed333e0-9070-45fd-a850-395ab3d8f8ed","Type":"ContainerDied","Data":"69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036"} Dec 03 19:48:33 crc kubenswrapper[4731]: I1203 19:48:33.908612 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrgcb" event={"ID":"2ed333e0-9070-45fd-a850-395ab3d8f8ed","Type":"ContainerStarted","Data":"4d7bf741e216328e704cf830fabdd7a628dcc083907186c509963c23d4c34262"} Dec 03 19:48:33 crc kubenswrapper[4731]: I1203 19:48:33.910051 4731 generic.go:334] "Generic (PLEG): container finished" podID="ef49f983-af7d-46ea-8196-109683380e89" containerID="0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a" exitCode=0 Dec 03 19:48:33 crc kubenswrapper[4731]: I1203 19:48:33.910102 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6264h" event={"ID":"ef49f983-af7d-46ea-8196-109683380e89","Type":"ContainerDied","Data":"0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a"} Dec 03 19:48:33 crc kubenswrapper[4731]: I1203 19:48:33.910140 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6264h" event={"ID":"ef49f983-af7d-46ea-8196-109683380e89","Type":"ContainerStarted","Data":"2c2ff7f4cd5cde9dc4e30ff2162f8c1ad79f146d0817bac066d2b3f2a19bc3a0"} Dec 03 19:48:34 crc kubenswrapper[4731]: I1203 19:48:34.922515 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrgcb" event={"ID":"2ed333e0-9070-45fd-a850-395ab3d8f8ed","Type":"ContainerStarted","Data":"a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3"} Dec 03 19:48:34 crc kubenswrapper[4731]: I1203 19:48:34.925582 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6264h" 
event={"ID":"ef49f983-af7d-46ea-8196-109683380e89","Type":"ContainerStarted","Data":"c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c"} Dec 03 19:48:35 crc kubenswrapper[4731]: I1203 19:48:35.940139 4731 generic.go:334] "Generic (PLEG): container finished" podID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerID="a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3" exitCode=0 Dec 03 19:48:35 crc kubenswrapper[4731]: I1203 19:48:35.940212 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrgcb" event={"ID":"2ed333e0-9070-45fd-a850-395ab3d8f8ed","Type":"ContainerDied","Data":"a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3"} Dec 03 19:48:35 crc kubenswrapper[4731]: I1203 19:48:35.944081 4731 generic.go:334] "Generic (PLEG): container finished" podID="ef49f983-af7d-46ea-8196-109683380e89" containerID="c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c" exitCode=0 Dec 03 19:48:35 crc kubenswrapper[4731]: I1203 19:48:35.944145 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6264h" event={"ID":"ef49f983-af7d-46ea-8196-109683380e89","Type":"ContainerDied","Data":"c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c"} Dec 03 19:48:36 crc kubenswrapper[4731]: I1203 19:48:36.955111 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrgcb" event={"ID":"2ed333e0-9070-45fd-a850-395ab3d8f8ed","Type":"ContainerStarted","Data":"3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7"} Dec 03 19:48:36 crc kubenswrapper[4731]: I1203 19:48:36.957641 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6264h" event={"ID":"ef49f983-af7d-46ea-8196-109683380e89","Type":"ContainerStarted","Data":"17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa"} Dec 03 19:48:36 crc kubenswrapper[4731]: I1203 
19:48:36.980165 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zrgcb" podStartSLOduration=2.552130703 podStartE2EDuration="4.980140149s" podCreationTimestamp="2025-12-03 19:48:32 +0000 UTC" firstStartedPulling="2025-12-03 19:48:33.910071183 +0000 UTC m=+3234.508665647" lastFinishedPulling="2025-12-03 19:48:36.338080639 +0000 UTC m=+3236.936675093" observedRunningTime="2025-12-03 19:48:36.974152315 +0000 UTC m=+3237.572746779" watchObservedRunningTime="2025-12-03 19:48:36.980140149 +0000 UTC m=+3237.578734613" Dec 03 19:48:37 crc kubenswrapper[4731]: I1203 19:48:37.001572 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6264h" podStartSLOduration=2.446295457 podStartE2EDuration="5.001550538s" podCreationTimestamp="2025-12-03 19:48:32 +0000 UTC" firstStartedPulling="2025-12-03 19:48:33.91156343 +0000 UTC m=+3234.510157894" lastFinishedPulling="2025-12-03 19:48:36.466818511 +0000 UTC m=+3237.065412975" observedRunningTime="2025-12-03 19:48:36.994340436 +0000 UTC m=+3237.592934900" watchObservedRunningTime="2025-12-03 19:48:37.001550538 +0000 UTC m=+3237.600145002" Dec 03 19:48:40 crc kubenswrapper[4731]: I1203 19:48:40.513674 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-td96h_a015a221-3196-466d-b58f-79a0b04104ec/cert-manager-controller/0.log" Dec 03 19:48:40 crc kubenswrapper[4731]: I1203 19:48:40.571681 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-ffkkn_670ef28e-2fa7-479f-9d0c-65164095dda5/cert-manager-cainjector/0.log" Dec 03 19:48:40 crc kubenswrapper[4731]: I1203 19:48:40.679038 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2vfp9_50c20406-8225-4008-a120-1e075514ef8d/cert-manager-webhook/0.log" Dec 03 19:48:42 crc kubenswrapper[4731]: I1203 
19:48:42.478420 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:42 crc kubenswrapper[4731]: I1203 19:48:42.479119 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:42 crc kubenswrapper[4731]: I1203 19:48:42.533469 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:42 crc kubenswrapper[4731]: I1203 19:48:42.736852 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:42 crc kubenswrapper[4731]: I1203 19:48:42.736937 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:42 crc kubenswrapper[4731]: I1203 19:48:42.793857 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:43 crc kubenswrapper[4731]: I1203 19:48:43.093542 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:43 crc kubenswrapper[4731]: I1203 19:48:43.106700 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:44 crc kubenswrapper[4731]: I1203 19:48:44.522719 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrgcb"] Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.057124 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zrgcb" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerName="registry-server" 
containerID="cri-o://3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7" gracePeriod=2 Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.528069 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6264h"] Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.528891 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6264h" podUID="ef49f983-af7d-46ea-8196-109683380e89" containerName="registry-server" containerID="cri-o://17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa" gracePeriod=2 Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.560742 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.591321 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-catalog-content\") pod \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.591426 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-utilities\") pod \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.591601 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8x4c\" (UniqueName: \"kubernetes.io/projected/2ed333e0-9070-45fd-a850-395ab3d8f8ed-kube-api-access-d8x4c\") pod \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\" (UID: \"2ed333e0-9070-45fd-a850-395ab3d8f8ed\") " Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.592582 4731 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-utilities" (OuterVolumeSpecName: "utilities") pod "2ed333e0-9070-45fd-a850-395ab3d8f8ed" (UID: "2ed333e0-9070-45fd-a850-395ab3d8f8ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.601841 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed333e0-9070-45fd-a850-395ab3d8f8ed-kube-api-access-d8x4c" (OuterVolumeSpecName: "kube-api-access-d8x4c") pod "2ed333e0-9070-45fd-a850-395ab3d8f8ed" (UID: "2ed333e0-9070-45fd-a850-395ab3d8f8ed"). InnerVolumeSpecName "kube-api-access-d8x4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.661354 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ed333e0-9070-45fd-a850-395ab3d8f8ed" (UID: "2ed333e0-9070-45fd-a850-395ab3d8f8ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.694071 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.694126 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8x4c\" (UniqueName: \"kubernetes.io/projected/2ed333e0-9070-45fd-a850-395ab3d8f8ed-kube-api-access-d8x4c\") on node \"crc\" DevicePath \"\"" Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.694145 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed333e0-9070-45fd-a850-395ab3d8f8ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:48:45 crc kubenswrapper[4731]: I1203 19:48:45.946199 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.000028 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-utilities" (OuterVolumeSpecName: "utilities") pod "ef49f983-af7d-46ea-8196-109683380e89" (UID: "ef49f983-af7d-46ea-8196-109683380e89"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.000082 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-utilities\") pod \"ef49f983-af7d-46ea-8196-109683380e89\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.000116 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-catalog-content\") pod \"ef49f983-af7d-46ea-8196-109683380e89\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.000199 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncfpd\" (UniqueName: \"kubernetes.io/projected/ef49f983-af7d-46ea-8196-109683380e89-kube-api-access-ncfpd\") pod \"ef49f983-af7d-46ea-8196-109683380e89\" (UID: \"ef49f983-af7d-46ea-8196-109683380e89\") " Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.001149 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.011872 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef49f983-af7d-46ea-8196-109683380e89-kube-api-access-ncfpd" (OuterVolumeSpecName: "kube-api-access-ncfpd") pod "ef49f983-af7d-46ea-8196-109683380e89" (UID: "ef49f983-af7d-46ea-8196-109683380e89"). InnerVolumeSpecName "kube-api-access-ncfpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.026356 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef49f983-af7d-46ea-8196-109683380e89" (UID: "ef49f983-af7d-46ea-8196-109683380e89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.070699 4731 generic.go:334] "Generic (PLEG): container finished" podID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerID="3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7" exitCode=0 Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.070803 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrgcb" event={"ID":"2ed333e0-9070-45fd-a850-395ab3d8f8ed","Type":"ContainerDied","Data":"3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7"} Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.070840 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrgcb" event={"ID":"2ed333e0-9070-45fd-a850-395ab3d8f8ed","Type":"ContainerDied","Data":"4d7bf741e216328e704cf830fabdd7a628dcc083907186c509963c23d4c34262"} Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.070875 4731 scope.go:117] "RemoveContainer" containerID="3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.071069 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zrgcb" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.084941 4731 generic.go:334] "Generic (PLEG): container finished" podID="ef49f983-af7d-46ea-8196-109683380e89" containerID="17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa" exitCode=0 Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.085001 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6264h" event={"ID":"ef49f983-af7d-46ea-8196-109683380e89","Type":"ContainerDied","Data":"17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa"} Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.085037 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6264h" event={"ID":"ef49f983-af7d-46ea-8196-109683380e89","Type":"ContainerDied","Data":"2c2ff7f4cd5cde9dc4e30ff2162f8c1ad79f146d0817bac066d2b3f2a19bc3a0"} Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.085146 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6264h" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.103139 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef49f983-af7d-46ea-8196-109683380e89-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.103194 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncfpd\" (UniqueName: \"kubernetes.io/projected/ef49f983-af7d-46ea-8196-109683380e89-kube-api-access-ncfpd\") on node \"crc\" DevicePath \"\"" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.108201 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrgcb"] Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.119834 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zrgcb"] Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.125709 4731 scope.go:117] "RemoveContainer" containerID="a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.138532 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6264h"] Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.145244 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6264h"] Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.148082 4731 scope.go:117] "RemoveContainer" containerID="69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.166696 4731 scope.go:117] "RemoveContainer" containerID="3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7" Dec 03 19:48:46 crc kubenswrapper[4731]: E1203 19:48:46.167306 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7\": container with ID starting with 3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7 not found: ID does not exist" containerID="3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.167376 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7"} err="failed to get container status \"3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7\": rpc error: code = NotFound desc = could not find container \"3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7\": container with ID starting with 3a0a3a379fb7da3e7348abfeebf2da5c11881f4a9351c4e4b09faf66dc9760f7 not found: ID does not exist" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.167410 4731 scope.go:117] "RemoveContainer" containerID="a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3" Dec 03 19:48:46 crc kubenswrapper[4731]: E1203 19:48:46.167943 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3\": container with ID starting with a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3 not found: ID does not exist" containerID="a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.167984 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3"} err="failed to get container status \"a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3\": rpc error: code = NotFound desc = could not find container 
\"a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3\": container with ID starting with a016e4deb1e05bd955bb672623531f4e2b5d015e7f544f7f48ff9633fba188f3 not found: ID does not exist" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.168010 4731 scope.go:117] "RemoveContainer" containerID="69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036" Dec 03 19:48:46 crc kubenswrapper[4731]: E1203 19:48:46.168356 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036\": container with ID starting with 69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036 not found: ID does not exist" containerID="69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.168415 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036"} err="failed to get container status \"69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036\": rpc error: code = NotFound desc = could not find container \"69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036\": container with ID starting with 69b1f1a3a31921b35cc42aee5217741c8645aae600c0f02b53d148a8c623d036 not found: ID does not exist" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.168466 4731 scope.go:117] "RemoveContainer" containerID="17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.231355 4731 scope.go:117] "RemoveContainer" containerID="c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.252796 4731 scope.go:117] "RemoveContainer" containerID="0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a" Dec 03 19:48:46 crc 
kubenswrapper[4731]: I1203 19:48:46.321096 4731 scope.go:117] "RemoveContainer" containerID="17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa" Dec 03 19:48:46 crc kubenswrapper[4731]: E1203 19:48:46.321822 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa\": container with ID starting with 17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa not found: ID does not exist" containerID="17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.321889 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa"} err="failed to get container status \"17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa\": rpc error: code = NotFound desc = could not find container \"17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa\": container with ID starting with 17a7951365d4985733d66ba0d68ee6c293c4149a16f84f96a8d41289d6ac2eaa not found: ID does not exist" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.321930 4731 scope.go:117] "RemoveContainer" containerID="c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c" Dec 03 19:48:46 crc kubenswrapper[4731]: E1203 19:48:46.328950 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c\": container with ID starting with c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c not found: ID does not exist" containerID="c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.329542 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c"} err="failed to get container status \"c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c\": rpc error: code = NotFound desc = could not find container \"c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c\": container with ID starting with c0fde29894b14b17dcaf78d5d1045b562bcee0a7dd3e0c1e488470b181d75b3c not found: ID does not exist" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.329589 4731 scope.go:117] "RemoveContainer" containerID="0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a" Dec 03 19:48:46 crc kubenswrapper[4731]: E1203 19:48:46.335149 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a\": container with ID starting with 0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a not found: ID does not exist" containerID="0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a" Dec 03 19:48:46 crc kubenswrapper[4731]: I1203 19:48:46.335371 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a"} err="failed to get container status \"0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a\": rpc error: code = NotFound desc = could not find container \"0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a\": container with ID starting with 0cbdac74796406b9354e2500f6a0014fc9cdcbfa75fd9250d62e990047307d7a not found: ID does not exist" Dec 03 19:48:47 crc kubenswrapper[4731]: I1203 19:48:47.869060 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" path="/var/lib/kubelet/pods/2ed333e0-9070-45fd-a850-395ab3d8f8ed/volumes" Dec 03 19:48:47 crc kubenswrapper[4731]: I1203 
19:48:47.870743 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef49f983-af7d-46ea-8196-109683380e89" path="/var/lib/kubelet/pods/ef49f983-af7d-46ea-8196-109683380e89/volumes" Dec 03 19:48:54 crc kubenswrapper[4731]: I1203 19:48:54.809043 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-7cf69_afac9508-23d7-4b28-a52b-c6bf555cc02a/nmstate-console-plugin/0.log" Dec 03 19:48:55 crc kubenswrapper[4731]: I1203 19:48:55.046937 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-g4tbf_85c23854-5afa-4083-acdb-da40a631204b/nmstate-handler/0.log" Dec 03 19:48:55 crc kubenswrapper[4731]: I1203 19:48:55.094177 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-9qx9k_950127c6-7145-4075-9956-2922dcfb6d9a/kube-rbac-proxy/0.log" Dec 03 19:48:55 crc kubenswrapper[4731]: I1203 19:48:55.200341 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-9qx9k_950127c6-7145-4075-9956-2922dcfb6d9a/nmstate-metrics/0.log" Dec 03 19:48:55 crc kubenswrapper[4731]: I1203 19:48:55.357607 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-mg7cg_75cd2a87-eba6-4d06-a66c-3740a62f7496/nmstate-operator/0.log" Dec 03 19:48:55 crc kubenswrapper[4731]: I1203 19:48:55.463500 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-gpff6_52d46f33-0c4f-403f-a207-8ebb320e6c0d/nmstate-webhook/0.log" Dec 03 19:49:10 crc kubenswrapper[4731]: I1203 19:49:10.463712 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-rs9fx_a0a92812-f3d1-4c8b-816d-034b6cdc1438/kube-rbac-proxy/0.log" Dec 03 19:49:10 crc kubenswrapper[4731]: I1203 19:49:10.666858 4731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-f8648f98b-rs9fx_a0a92812-f3d1-4c8b-816d-034b6cdc1438/controller/0.log" Dec 03 19:49:10 crc kubenswrapper[4731]: I1203 19:49:10.751245 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-frr-files/0.log" Dec 03 19:49:10 crc kubenswrapper[4731]: I1203 19:49:10.923405 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-reloader/0.log" Dec 03 19:49:10 crc kubenswrapper[4731]: I1203 19:49:10.929934 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-metrics/0.log" Dec 03 19:49:10 crc kubenswrapper[4731]: I1203 19:49:10.930296 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-frr-files/0.log" Dec 03 19:49:10 crc kubenswrapper[4731]: I1203 19:49:10.967503 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-reloader/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.244793 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-reloader/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.256586 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-metrics/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.284217 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-frr-files/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.292889 4731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-metrics/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.501553 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-frr-files/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.501976 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-reloader/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.548384 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/cp-metrics/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.554362 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/controller/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.769601 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/frr-metrics/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.791588 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/kube-rbac-proxy/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.804469 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/kube-rbac-proxy-frr/0.log" Dec 03 19:49:11 crc kubenswrapper[4731]: I1203 19:49:11.996898 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-s9r7r_e655adda-c328-44ce-a95e-7cfe44ce671c/frr-k8s-webhook-server/0.log" Dec 03 19:49:12 crc kubenswrapper[4731]: I1203 19:49:12.047071 4731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/reloader/0.log" Dec 03 19:49:12 crc kubenswrapper[4731]: I1203 19:49:12.341118 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7686ff65d7-9nhhv_6a724a74-ca11-42ef-8a1f-96665b3a6773/manager/0.log" Dec 03 19:49:12 crc kubenswrapper[4731]: I1203 19:49:12.405569 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79695f5758-49qdw_2255151e-dced-4ba0-8329-89984ef8583d/webhook-server/0.log" Dec 03 19:49:12 crc kubenswrapper[4731]: I1203 19:49:12.711214 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8zkk6_ce748c4e-5fb2-4792-bd00-3294f9d85144/kube-rbac-proxy/0.log" Dec 03 19:49:13 crc kubenswrapper[4731]: I1203 19:49:13.038669 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vxn82_974efcbe-f04c-46e1-b9d6-4cd2a537db71/frr/0.log" Dec 03 19:49:13 crc kubenswrapper[4731]: I1203 19:49:13.111975 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8zkk6_ce748c4e-5fb2-4792-bd00-3294f9d85144/speaker/0.log" Dec 03 19:49:25 crc kubenswrapper[4731]: I1203 19:49:25.893163 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj_0a940e33-2b46-4dd2-9df7-94c8217c5969/util/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.019932 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj_0a940e33-2b46-4dd2-9df7-94c8217c5969/util/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.026427 4731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj_0a940e33-2b46-4dd2-9df7-94c8217c5969/pull/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.066792 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj_0a940e33-2b46-4dd2-9df7-94c8217c5969/pull/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.241406 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj_0a940e33-2b46-4dd2-9df7-94c8217c5969/pull/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.243990 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj_0a940e33-2b46-4dd2-9df7-94c8217c5969/util/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.296797 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnqxxj_0a940e33-2b46-4dd2-9df7-94c8217c5969/extract/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.414564 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5_e820b8d7-cbff-4738-9618-0b1744a2bd9c/util/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.617727 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5_e820b8d7-cbff-4738-9618-0b1744a2bd9c/util/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.631694 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5_e820b8d7-cbff-4738-9618-0b1744a2bd9c/pull/0.log" Dec 03 
19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.651530 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5_e820b8d7-cbff-4738-9618-0b1744a2bd9c/pull/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.899432 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5_e820b8d7-cbff-4738-9618-0b1744a2bd9c/util/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.943234 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5_e820b8d7-cbff-4738-9618-0b1744a2bd9c/extract/0.log" Dec 03 19:49:26 crc kubenswrapper[4731]: I1203 19:49:26.952436 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q5rl5_e820b8d7-cbff-4738-9618-0b1744a2bd9c/pull/0.log" Dec 03 19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.113675 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ljts9_87bd198b-ff22-4e12-86ab-dfb52adbe31c/extract-utilities/0.log" Dec 03 19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.281862 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ljts9_87bd198b-ff22-4e12-86ab-dfb52adbe31c/extract-content/0.log" Dec 03 19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.283988 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ljts9_87bd198b-ff22-4e12-86ab-dfb52adbe31c/extract-utilities/0.log" Dec 03 19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.317325 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ljts9_87bd198b-ff22-4e12-86ab-dfb52adbe31c/extract-content/0.log" Dec 03 
19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.461577 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ljts9_87bd198b-ff22-4e12-86ab-dfb52adbe31c/extract-utilities/0.log" Dec 03 19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.508585 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ljts9_87bd198b-ff22-4e12-86ab-dfb52adbe31c/extract-content/0.log" Dec 03 19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.740274 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sf62s_a94f173d-5304-4cd9-bdfc-2dfb032b154c/extract-utilities/0.log" Dec 03 19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.973310 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sf62s_a94f173d-5304-4cd9-bdfc-2dfb032b154c/extract-utilities/0.log" Dec 03 19:49:27 crc kubenswrapper[4731]: I1203 19:49:27.983139 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ljts9_87bd198b-ff22-4e12-86ab-dfb52adbe31c/registry-server/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.034959 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sf62s_a94f173d-5304-4cd9-bdfc-2dfb032b154c/extract-content/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.063486 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sf62s_a94f173d-5304-4cd9-bdfc-2dfb032b154c/extract-content/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.204974 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sf62s_a94f173d-5304-4cd9-bdfc-2dfb032b154c/extract-content/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.247084 4731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-sf62s_a94f173d-5304-4cd9-bdfc-2dfb032b154c/extract-utilities/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.412447 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bzsv6_58c5709d-2320-4da3-a897-bf4289ed68ee/marketplace-operator/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.550693 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q7flg_cbb9f0c2-c760-4f02-81d0-37194af5c296/extract-utilities/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.765330 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q7flg_cbb9f0c2-c760-4f02-81d0-37194af5c296/extract-utilities/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.809346 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sf62s_a94f173d-5304-4cd9-bdfc-2dfb032b154c/registry-server/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.831768 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q7flg_cbb9f0c2-c760-4f02-81d0-37194af5c296/extract-content/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.875870 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q7flg_cbb9f0c2-c760-4f02-81d0-37194af5c296/extract-content/0.log" Dec 03 19:49:28 crc kubenswrapper[4731]: I1203 19:49:28.967479 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q7flg_cbb9f0c2-c760-4f02-81d0-37194af5c296/extract-utilities/0.log" Dec 03 19:49:29 crc kubenswrapper[4731]: I1203 19:49:29.001422 4731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-q7flg_cbb9f0c2-c760-4f02-81d0-37194af5c296/extract-content/0.log" Dec 03 19:49:29 crc kubenswrapper[4731]: I1203 19:49:29.198642 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q7flg_cbb9f0c2-c760-4f02-81d0-37194af5c296/registry-server/0.log" Dec 03 19:49:29 crc kubenswrapper[4731]: I1203 19:49:29.248403 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bm84h_5740f025-332a-4be3-8473-ec656326c634/extract-utilities/0.log" Dec 03 19:49:29 crc kubenswrapper[4731]: I1203 19:49:29.361452 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bm84h_5740f025-332a-4be3-8473-ec656326c634/extract-utilities/0.log" Dec 03 19:49:29 crc kubenswrapper[4731]: I1203 19:49:29.441140 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bm84h_5740f025-332a-4be3-8473-ec656326c634/extract-content/0.log" Dec 03 19:49:29 crc kubenswrapper[4731]: I1203 19:49:29.445055 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bm84h_5740f025-332a-4be3-8473-ec656326c634/extract-content/0.log" Dec 03 19:49:29 crc kubenswrapper[4731]: I1203 19:49:29.589981 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bm84h_5740f025-332a-4be3-8473-ec656326c634/extract-utilities/0.log" Dec 03 19:49:29 crc kubenswrapper[4731]: I1203 19:49:29.618636 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bm84h_5740f025-332a-4be3-8473-ec656326c634/extract-content/0.log" Dec 03 19:49:30 crc kubenswrapper[4731]: I1203 19:49:30.190677 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bm84h_5740f025-332a-4be3-8473-ec656326c634/registry-server/0.log" Dec 03 
19:49:56 crc kubenswrapper[4731]: I1203 19:49:56.468364 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:49:56 crc kubenswrapper[4731]: I1203 19:49:56.469057 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.083452 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zqqt5"] Dec 03 19:50:20 crc kubenswrapper[4731]: E1203 19:50:20.084285 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerName="extract-content" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.084301 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerName="extract-content" Dec 03 19:50:20 crc kubenswrapper[4731]: E1203 19:50:20.084347 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerName="registry-server" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.084355 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerName="registry-server" Dec 03 19:50:20 crc kubenswrapper[4731]: E1203 19:50:20.084372 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerName="extract-utilities" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.084378 4731 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerName="extract-utilities" Dec 03 19:50:20 crc kubenswrapper[4731]: E1203 19:50:20.084401 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef49f983-af7d-46ea-8196-109683380e89" containerName="registry-server" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.084408 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef49f983-af7d-46ea-8196-109683380e89" containerName="registry-server" Dec 03 19:50:20 crc kubenswrapper[4731]: E1203 19:50:20.084417 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef49f983-af7d-46ea-8196-109683380e89" containerName="extract-utilities" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.084423 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef49f983-af7d-46ea-8196-109683380e89" containerName="extract-utilities" Dec 03 19:50:20 crc kubenswrapper[4731]: E1203 19:50:20.084436 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef49f983-af7d-46ea-8196-109683380e89" containerName="extract-content" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.084443 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef49f983-af7d-46ea-8196-109683380e89" containerName="extract-content" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.084673 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed333e0-9070-45fd-a850-395ab3d8f8ed" containerName="registry-server" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.084686 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef49f983-af7d-46ea-8196-109683380e89" containerName="registry-server" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.086616 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.112546 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zqqt5"] Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.168531 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-catalog-content\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.168729 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22ds\" (UniqueName: \"kubernetes.io/projected/76ba7335-e6e9-4af2-bb77-368eab015637-kube-api-access-w22ds\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.168788 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-utilities\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.269828 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-utilities\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.270014 4731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-catalog-content\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.270110 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22ds\" (UniqueName: \"kubernetes.io/projected/76ba7335-e6e9-4af2-bb77-368eab015637-kube-api-access-w22ds\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.270519 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-utilities\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.270591 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-catalog-content\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.293158 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22ds\" (UniqueName: \"kubernetes.io/projected/76ba7335-e6e9-4af2-bb77-368eab015637-kube-api-access-w22ds\") pod \"certified-operators-zqqt5\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:20 crc kubenswrapper[4731]: I1203 19:50:20.411346 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:21 crc kubenswrapper[4731]: I1203 19:50:21.021075 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zqqt5"] Dec 03 19:50:21 crc kubenswrapper[4731]: I1203 19:50:21.204954 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqqt5" event={"ID":"76ba7335-e6e9-4af2-bb77-368eab015637","Type":"ContainerStarted","Data":"94e74f071f2c5299a377486190329b37a82a78aec9a80ac162cc200dd8a59726"} Dec 03 19:50:22 crc kubenswrapper[4731]: I1203 19:50:22.247667 4731 generic.go:334] "Generic (PLEG): container finished" podID="76ba7335-e6e9-4af2-bb77-368eab015637" containerID="cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640" exitCode=0 Dec 03 19:50:22 crc kubenswrapper[4731]: I1203 19:50:22.248095 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqqt5" event={"ID":"76ba7335-e6e9-4af2-bb77-368eab015637","Type":"ContainerDied","Data":"cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640"} Dec 03 19:50:22 crc kubenswrapper[4731]: I1203 19:50:22.251140 4731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 19:50:23 crc kubenswrapper[4731]: I1203 19:50:23.260831 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqqt5" event={"ID":"76ba7335-e6e9-4af2-bb77-368eab015637","Type":"ContainerStarted","Data":"e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf"} Dec 03 19:50:24 crc kubenswrapper[4731]: I1203 19:50:24.277142 4731 generic.go:334] "Generic (PLEG): container finished" podID="76ba7335-e6e9-4af2-bb77-368eab015637" containerID="e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf" exitCode=0 Dec 03 19:50:24 crc kubenswrapper[4731]: I1203 19:50:24.277215 4731 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-zqqt5" event={"ID":"76ba7335-e6e9-4af2-bb77-368eab015637","Type":"ContainerDied","Data":"e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf"} Dec 03 19:50:25 crc kubenswrapper[4731]: I1203 19:50:25.295579 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqqt5" event={"ID":"76ba7335-e6e9-4af2-bb77-368eab015637","Type":"ContainerStarted","Data":"60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855"} Dec 03 19:50:25 crc kubenswrapper[4731]: I1203 19:50:25.328505 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zqqt5" podStartSLOduration=2.874854254 podStartE2EDuration="5.328483417s" podCreationTimestamp="2025-12-03 19:50:20 +0000 UTC" firstStartedPulling="2025-12-03 19:50:22.250758886 +0000 UTC m=+3342.849353390" lastFinishedPulling="2025-12-03 19:50:24.704388049 +0000 UTC m=+3345.302982553" observedRunningTime="2025-12-03 19:50:25.32598635 +0000 UTC m=+3345.924580814" watchObservedRunningTime="2025-12-03 19:50:25.328483417 +0000 UTC m=+3345.927077891" Dec 03 19:50:26 crc kubenswrapper[4731]: I1203 19:50:26.468799 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:50:26 crc kubenswrapper[4731]: I1203 19:50:26.469120 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:50:30 crc kubenswrapper[4731]: I1203 19:50:30.412383 4731 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:30 crc kubenswrapper[4731]: I1203 19:50:30.412931 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:30 crc kubenswrapper[4731]: I1203 19:50:30.470874 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:31 crc kubenswrapper[4731]: I1203 19:50:31.399211 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:31 crc kubenswrapper[4731]: I1203 19:50:31.459773 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zqqt5"] Dec 03 19:50:33 crc kubenswrapper[4731]: I1203 19:50:33.371135 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zqqt5" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" containerName="registry-server" containerID="cri-o://60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855" gracePeriod=2 Dec 03 19:50:33 crc kubenswrapper[4731]: I1203 19:50:33.981553 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.102960 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-utilities\") pod \"76ba7335-e6e9-4af2-bb77-368eab015637\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.103473 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w22ds\" (UniqueName: \"kubernetes.io/projected/76ba7335-e6e9-4af2-bb77-368eab015637-kube-api-access-w22ds\") pod \"76ba7335-e6e9-4af2-bb77-368eab015637\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.103654 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-catalog-content\") pod \"76ba7335-e6e9-4af2-bb77-368eab015637\" (UID: \"76ba7335-e6e9-4af2-bb77-368eab015637\") " Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.103856 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-utilities" (OuterVolumeSpecName: "utilities") pod "76ba7335-e6e9-4af2-bb77-368eab015637" (UID: "76ba7335-e6e9-4af2-bb77-368eab015637"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.104345 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.109021 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ba7335-e6e9-4af2-bb77-368eab015637-kube-api-access-w22ds" (OuterVolumeSpecName: "kube-api-access-w22ds") pod "76ba7335-e6e9-4af2-bb77-368eab015637" (UID: "76ba7335-e6e9-4af2-bb77-368eab015637"). InnerVolumeSpecName "kube-api-access-w22ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.191864 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76ba7335-e6e9-4af2-bb77-368eab015637" (UID: "76ba7335-e6e9-4af2-bb77-368eab015637"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.206829 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w22ds\" (UniqueName: \"kubernetes.io/projected/76ba7335-e6e9-4af2-bb77-368eab015637-kube-api-access-w22ds\") on node \"crc\" DevicePath \"\"" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.207157 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ba7335-e6e9-4af2-bb77-368eab015637-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.397082 4731 generic.go:334] "Generic (PLEG): container finished" podID="76ba7335-e6e9-4af2-bb77-368eab015637" containerID="60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855" exitCode=0 Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.397157 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zqqt5" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.397967 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqqt5" event={"ID":"76ba7335-e6e9-4af2-bb77-368eab015637","Type":"ContainerDied","Data":"60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855"} Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.398087 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqqt5" event={"ID":"76ba7335-e6e9-4af2-bb77-368eab015637","Type":"ContainerDied","Data":"94e74f071f2c5299a377486190329b37a82a78aec9a80ac162cc200dd8a59726"} Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.398124 4731 scope.go:117] "RemoveContainer" containerID="60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.429534 4731 scope.go:117] "RemoveContainer" 
containerID="e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.432825 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zqqt5"] Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.447571 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zqqt5"] Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.457574 4731 scope.go:117] "RemoveContainer" containerID="cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.503412 4731 scope.go:117] "RemoveContainer" containerID="60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855" Dec 03 19:50:34 crc kubenswrapper[4731]: E1203 19:50:34.504013 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855\": container with ID starting with 60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855 not found: ID does not exist" containerID="60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.504080 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855"} err="failed to get container status \"60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855\": rpc error: code = NotFound desc = could not find container \"60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855\": container with ID starting with 60c09929a12401ca106d78967082e387a2e62b279ecbe0bbe6075937f3029855 not found: ID does not exist" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.504115 4731 scope.go:117] "RemoveContainer" 
containerID="e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf" Dec 03 19:50:34 crc kubenswrapper[4731]: E1203 19:50:34.504662 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf\": container with ID starting with e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf not found: ID does not exist" containerID="e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.504695 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf"} err="failed to get container status \"e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf\": rpc error: code = NotFound desc = could not find container \"e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf\": container with ID starting with e36bc85e432b818dab85593add2979c58f51ebeb2caece05f6dc80e11b9c9abf not found: ID does not exist" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.504713 4731 scope.go:117] "RemoveContainer" containerID="cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640" Dec 03 19:50:34 crc kubenswrapper[4731]: E1203 19:50:34.505028 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640\": container with ID starting with cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640 not found: ID does not exist" containerID="cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640" Dec 03 19:50:34 crc kubenswrapper[4731]: I1203 19:50:34.505115 4731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640"} err="failed to get container status \"cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640\": rpc error: code = NotFound desc = could not find container \"cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640\": container with ID starting with cc5352368a211cab37f240732f9bc42f3becd18d6f7eee41c0db6f434f33c640 not found: ID does not exist" Dec 03 19:50:35 crc kubenswrapper[4731]: I1203 19:50:35.868485 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" path="/var/lib/kubelet/pods/76ba7335-e6e9-4af2-bb77-368eab015637/volumes" Dec 03 19:50:56 crc kubenswrapper[4731]: I1203 19:50:56.468721 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 19:50:56 crc kubenswrapper[4731]: I1203 19:50:56.469284 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 19:50:56 crc kubenswrapper[4731]: I1203 19:50:56.469334 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" Dec 03 19:50:56 crc kubenswrapper[4731]: I1203 19:50:56.469843 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7aa9760dbe2905d27052527c087aa5641325fe330e7320b8eb33222275cbfb7a"} 
pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 19:50:56 crc kubenswrapper[4731]: I1203 19:50:56.469892 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://7aa9760dbe2905d27052527c087aa5641325fe330e7320b8eb33222275cbfb7a" gracePeriod=600 Dec 03 19:50:56 crc kubenswrapper[4731]: I1203 19:50:56.632509 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="7aa9760dbe2905d27052527c087aa5641325fe330e7320b8eb33222275cbfb7a" exitCode=0 Dec 03 19:50:56 crc kubenswrapper[4731]: I1203 19:50:56.632615 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"7aa9760dbe2905d27052527c087aa5641325fe330e7320b8eb33222275cbfb7a"} Dec 03 19:50:56 crc kubenswrapper[4731]: I1203 19:50:56.632891 4731 scope.go:117] "RemoveContainer" containerID="4d50249ddd6d67add2fb8375a74f7a204f2b3c02818c0eb661d593a1fd6edc48" Dec 03 19:50:57 crc kubenswrapper[4731]: I1203 19:50:57.643287 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerStarted","Data":"99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"} Dec 03 19:51:14 crc kubenswrapper[4731]: I1203 19:51:14.842711 4731 generic.go:334] "Generic (PLEG): container finished" podID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerID="01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d" exitCode=0 Dec 03 19:51:14 crc kubenswrapper[4731]: I1203 19:51:14.842813 4731 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4sxz/must-gather-sfszb" event={"ID":"e9c78747-48e2-4e90-a7ad-4c624da161ad","Type":"ContainerDied","Data":"01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d"} Dec 03 19:51:14 crc kubenswrapper[4731]: I1203 19:51:14.844002 4731 scope.go:117] "RemoveContainer" containerID="01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d" Dec 03 19:51:15 crc kubenswrapper[4731]: I1203 19:51:15.293492 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k4sxz_must-gather-sfszb_e9c78747-48e2-4e90-a7ad-4c624da161ad/gather/0.log" Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.303966 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k4sxz/must-gather-sfszb"] Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.304792 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-k4sxz/must-gather-sfszb" podUID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerName="copy" containerID="cri-o://12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f" gracePeriod=2 Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.311960 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k4sxz/must-gather-sfszb"] Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.748315 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k4sxz_must-gather-sfszb_e9c78747-48e2-4e90-a7ad-4c624da161ad/copy/0.log" Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.749133 4731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.937566 4731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k4sxz_must-gather-sfszb_e9c78747-48e2-4e90-a7ad-4c624da161ad/copy/0.log" Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.938396 4731 generic.go:334] "Generic (PLEG): container finished" podID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerID="12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f" exitCode=143 Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.938467 4731 scope.go:117] "RemoveContainer" containerID="12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f" Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.938472 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k4sxz/must-gather-sfszb" Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.945564 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk2jm\" (UniqueName: \"kubernetes.io/projected/e9c78747-48e2-4e90-a7ad-4c624da161ad-kube-api-access-hk2jm\") pod \"e9c78747-48e2-4e90-a7ad-4c624da161ad\" (UID: \"e9c78747-48e2-4e90-a7ad-4c624da161ad\") " Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.946114 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9c78747-48e2-4e90-a7ad-4c624da161ad-must-gather-output\") pod \"e9c78747-48e2-4e90-a7ad-4c624da161ad\" (UID: \"e9c78747-48e2-4e90-a7ad-4c624da161ad\") " Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.964030 4731 scope.go:117] "RemoveContainer" containerID="01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d" Dec 03 19:51:23 crc kubenswrapper[4731]: I1203 19:51:23.972560 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e9c78747-48e2-4e90-a7ad-4c624da161ad-kube-api-access-hk2jm" (OuterVolumeSpecName: "kube-api-access-hk2jm") pod "e9c78747-48e2-4e90-a7ad-4c624da161ad" (UID: "e9c78747-48e2-4e90-a7ad-4c624da161ad"). InnerVolumeSpecName "kube-api-access-hk2jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 19:51:24 crc kubenswrapper[4731]: I1203 19:51:24.071313 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk2jm\" (UniqueName: \"kubernetes.io/projected/e9c78747-48e2-4e90-a7ad-4c624da161ad-kube-api-access-hk2jm\") on node \"crc\" DevicePath \"\"" Dec 03 19:51:24 crc kubenswrapper[4731]: I1203 19:51:24.096122 4731 scope.go:117] "RemoveContainer" containerID="12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f" Dec 03 19:51:24 crc kubenswrapper[4731]: E1203 19:51:24.096760 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f\": container with ID starting with 12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f not found: ID does not exist" containerID="12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f" Dec 03 19:51:24 crc kubenswrapper[4731]: I1203 19:51:24.096814 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f"} err="failed to get container status \"12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f\": rpc error: code = NotFound desc = could not find container \"12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f\": container with ID starting with 12c0cc1fb6caa00846a052b5ab7dd1b128361d64d5b68a529b0f789da0982a5f not found: ID does not exist" Dec 03 19:51:24 crc kubenswrapper[4731]: I1203 19:51:24.096845 4731 scope.go:117] "RemoveContainer" 
containerID="01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d" Dec 03 19:51:24 crc kubenswrapper[4731]: E1203 19:51:24.097333 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d\": container with ID starting with 01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d not found: ID does not exist" containerID="01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d" Dec 03 19:51:24 crc kubenswrapper[4731]: I1203 19:51:24.097365 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d"} err="failed to get container status \"01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d\": rpc error: code = NotFound desc = could not find container \"01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d\": container with ID starting with 01646daaa7e82e5840d773807ee00839686eeba71d6374d4a391441c527cf50d not found: ID does not exist" Dec 03 19:51:24 crc kubenswrapper[4731]: I1203 19:51:24.114480 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c78747-48e2-4e90-a7ad-4c624da161ad-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e9c78747-48e2-4e90-a7ad-4c624da161ad" (UID: "e9c78747-48e2-4e90-a7ad-4c624da161ad"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 19:51:24 crc kubenswrapper[4731]: I1203 19:51:24.173946 4731 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9c78747-48e2-4e90-a7ad-4c624da161ad-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 19:51:25 crc kubenswrapper[4731]: I1203 19:51:25.871184 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c78747-48e2-4e90-a7ad-4c624da161ad" path="/var/lib/kubelet/pods/e9c78747-48e2-4e90-a7ad-4c624da161ad/volumes" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.775395 4731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xvndf"] Dec 03 19:52:10 crc kubenswrapper[4731]: E1203 19:52:10.776369 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerName="gather" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.776384 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerName="gather" Dec 03 19:52:10 crc kubenswrapper[4731]: E1203 19:52:10.776397 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerName="copy" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.776403 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerName="copy" Dec 03 19:52:10 crc kubenswrapper[4731]: E1203 19:52:10.776413 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" containerName="extract-content" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.776419 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" containerName="extract-content" Dec 03 19:52:10 crc kubenswrapper[4731]: E1203 19:52:10.776432 4731 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" containerName="extract-utilities" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.776438 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" containerName="extract-utilities" Dec 03 19:52:10 crc kubenswrapper[4731]: E1203 19:52:10.776455 4731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" containerName="registry-server" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.776461 4731 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" containerName="registry-server" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.776654 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ba7335-e6e9-4af2-bb77-368eab015637" containerName="registry-server" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.776677 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerName="gather" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.776688 4731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c78747-48e2-4e90-a7ad-4c624da161ad" containerName="copy" Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.778111 4731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.820017 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvndf"]
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.879686 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6ll7\" (UniqueName: \"kubernetes.io/projected/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-kube-api-access-s6ll7\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.879876 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-catalog-content\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.880093 4731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-utilities\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.982433 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-utilities\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.982581 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6ll7\" (UniqueName: \"kubernetes.io/projected/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-kube-api-access-s6ll7\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.982636 4731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-catalog-content\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.983568 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-utilities\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:10 crc kubenswrapper[4731]: I1203 19:52:10.983587 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-catalog-content\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:11 crc kubenswrapper[4731]: I1203 19:52:11.012674 4731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6ll7\" (UniqueName: \"kubernetes.io/projected/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-kube-api-access-s6ll7\") pod \"redhat-operators-xvndf\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") " pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:11 crc kubenswrapper[4731]: I1203 19:52:11.107628 4731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:11 crc kubenswrapper[4731]: I1203 19:52:11.619718 4731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvndf"]
Dec 03 19:52:12 crc kubenswrapper[4731]: I1203 19:52:12.448587 4731 generic.go:334] "Generic (PLEG): container finished" podID="fb787f52-b74f-4793-baa0-1cc3f2e5a0e5" containerID="88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe" exitCode=0
Dec 03 19:52:12 crc kubenswrapper[4731]: I1203 19:52:12.448869 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvndf" event={"ID":"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5","Type":"ContainerDied","Data":"88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe"}
Dec 03 19:52:12 crc kubenswrapper[4731]: I1203 19:52:12.448905 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvndf" event={"ID":"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5","Type":"ContainerStarted","Data":"cc2c4e9d575be693b5a7007b3047ddc8837836740b3ceac840ca6f0fb8b24f81"}
Dec 03 19:52:14 crc kubenswrapper[4731]: I1203 19:52:14.469415 4731 generic.go:334] "Generic (PLEG): container finished" podID="fb787f52-b74f-4793-baa0-1cc3f2e5a0e5" containerID="10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee" exitCode=0
Dec 03 19:52:14 crc kubenswrapper[4731]: I1203 19:52:14.469505 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvndf" event={"ID":"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5","Type":"ContainerDied","Data":"10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee"}
Dec 03 19:52:15 crc kubenswrapper[4731]: I1203 19:52:15.482168 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvndf" event={"ID":"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5","Type":"ContainerStarted","Data":"863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281"}
Dec 03 19:52:15 crc kubenswrapper[4731]: I1203 19:52:15.509707 4731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xvndf" podStartSLOduration=2.965650284 podStartE2EDuration="5.509679571s" podCreationTimestamp="2025-12-03 19:52:10 +0000 UTC" firstStartedPulling="2025-12-03 19:52:12.45078937 +0000 UTC m=+3453.049383834" lastFinishedPulling="2025-12-03 19:52:14.994818657 +0000 UTC m=+3455.593413121" observedRunningTime="2025-12-03 19:52:15.499720211 +0000 UTC m=+3456.098314685" watchObservedRunningTime="2025-12-03 19:52:15.509679571 +0000 UTC m=+3456.108274035"
Dec 03 19:52:21 crc kubenswrapper[4731]: I1203 19:52:21.108104 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:21 crc kubenswrapper[4731]: I1203 19:52:21.108792 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:21 crc kubenswrapper[4731]: I1203 19:52:21.166648 4731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:21 crc kubenswrapper[4731]: I1203 19:52:21.604521 4731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:21 crc kubenswrapper[4731]: I1203 19:52:21.655043 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvndf"]
Dec 03 19:52:23 crc kubenswrapper[4731]: I1203 19:52:23.552114 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xvndf" podUID="fb787f52-b74f-4793-baa0-1cc3f2e5a0e5" containerName="registry-server" containerID="cri-o://863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281" gracePeriod=2
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.039535 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.142830 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-catalog-content\") pod \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") "
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.142964 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-utilities\") pod \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") "
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.143107 4731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6ll7\" (UniqueName: \"kubernetes.io/projected/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-kube-api-access-s6ll7\") pod \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\" (UID: \"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5\") "
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.144005 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-utilities" (OuterVolumeSpecName: "utilities") pod "fb787f52-b74f-4793-baa0-1cc3f2e5a0e5" (UID: "fb787f52-b74f-4793-baa0-1cc3f2e5a0e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.145563 4731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.148390 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-kube-api-access-s6ll7" (OuterVolumeSpecName: "kube-api-access-s6ll7") pod "fb787f52-b74f-4793-baa0-1cc3f2e5a0e5" (UID: "fb787f52-b74f-4793-baa0-1cc3f2e5a0e5"). InnerVolumeSpecName "kube-api-access-s6ll7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.246973 4731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6ll7\" (UniqueName: \"kubernetes.io/projected/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-kube-api-access-s6ll7\") on node \"crc\" DevicePath \"\""
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.562927 4731 generic.go:334] "Generic (PLEG): container finished" podID="fb787f52-b74f-4793-baa0-1cc3f2e5a0e5" containerID="863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281" exitCode=0
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.563062 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvndf" event={"ID":"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5","Type":"ContainerDied","Data":"863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281"}
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.563309 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvndf" event={"ID":"fb787f52-b74f-4793-baa0-1cc3f2e5a0e5","Type":"ContainerDied","Data":"cc2c4e9d575be693b5a7007b3047ddc8837836740b3ceac840ca6f0fb8b24f81"}
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.563348 4731 scope.go:117] "RemoveContainer" containerID="863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.563122 4731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvndf"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.584310 4731 scope.go:117] "RemoveContainer" containerID="10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.604712 4731 scope.go:117] "RemoveContainer" containerID="88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.647569 4731 scope.go:117] "RemoveContainer" containerID="863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281"
Dec 03 19:52:24 crc kubenswrapper[4731]: E1203 19:52:24.648549 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281\": container with ID starting with 863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281 not found: ID does not exist" containerID="863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.648641 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281"} err="failed to get container status \"863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281\": rpc error: code = NotFound desc = could not find container \"863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281\": container with ID starting with 863490d8db0830862548b00cd8d6248daffd9856f5314db18e0f3b4bb190d281 not found: ID does not exist"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.648666 4731 scope.go:117] "RemoveContainer" containerID="10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee"
Dec 03 19:52:24 crc kubenswrapper[4731]: E1203 19:52:24.649096 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee\": container with ID starting with 10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee not found: ID does not exist" containerID="10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.649359 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee"} err="failed to get container status \"10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee\": rpc error: code = NotFound desc = could not find container \"10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee\": container with ID starting with 10e9722094b70ef5dba19f4ae247c365507ab58b86c3499a52370c10a8a52dee not found: ID does not exist"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.649540 4731 scope.go:117] "RemoveContainer" containerID="88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe"
Dec 03 19:52:24 crc kubenswrapper[4731]: E1203 19:52:24.650155 4731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe\": container with ID starting with 88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe not found: ID does not exist" containerID="88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe"
Dec 03 19:52:24 crc kubenswrapper[4731]: I1203 19:52:24.650189 4731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe"} err="failed to get container status \"88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe\": rpc error: code = NotFound desc = could not find container \"88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe\": container with ID starting with 88d417dc670cae0ee8e19e1405a0c4b1809e0c6703e1261dbf9a43e8097cd0fe not found: ID does not exist"
Dec 03 19:52:25 crc kubenswrapper[4731]: I1203 19:52:25.452448 4731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb787f52-b74f-4793-baa0-1cc3f2e5a0e5" (UID: "fb787f52-b74f-4793-baa0-1cc3f2e5a0e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 19:52:25 crc kubenswrapper[4731]: I1203 19:52:25.493998 4731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 19:52:25 crc kubenswrapper[4731]: I1203 19:52:25.497888 4731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvndf"]
Dec 03 19:52:25 crc kubenswrapper[4731]: I1203 19:52:25.507516 4731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xvndf"]
Dec 03 19:52:25 crc kubenswrapper[4731]: I1203 19:52:25.868291 4731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb787f52-b74f-4793-baa0-1cc3f2e5a0e5" path="/var/lib/kubelet/pods/fb787f52-b74f-4793-baa0-1cc3f2e5a0e5/volumes"
Dec 03 19:52:56 crc kubenswrapper[4731]: I1203 19:52:56.469145 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 19:52:56 crc kubenswrapper[4731]: I1203 19:52:56.470575 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 19:53:26 crc kubenswrapper[4731]: I1203 19:53:26.469385 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 19:53:26 crc kubenswrapper[4731]: I1203 19:53:26.470045 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 19:53:56 crc kubenswrapper[4731]: I1203 19:53:56.469084 4731 patch_prober.go:28] interesting pod/machine-config-daemon-mmjcd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 19:53:56 crc kubenswrapper[4731]: I1203 19:53:56.469835 4731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 19:53:56 crc kubenswrapper[4731]: I1203 19:53:56.469919 4731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd"
Dec 03 19:53:56 crc kubenswrapper[4731]: I1203 19:53:56.470979 4731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"} pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 19:53:56 crc kubenswrapper[4731]: I1203 19:53:56.471074 4731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerName="machine-config-daemon" containerID="cri-o://99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7" gracePeriod=600
Dec 03 19:53:56 crc kubenswrapper[4731]: E1203 19:53:56.596519 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:53:57 crc kubenswrapper[4731]: I1203 19:53:57.458924 4731 generic.go:334] "Generic (PLEG): container finished" podID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7" exitCode=0
Dec 03 19:53:57 crc kubenswrapper[4731]: I1203 19:53:57.459325 4731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" event={"ID":"95dced4d-3fd5-43d3-b87d-21ec9c80de8b","Type":"ContainerDied","Data":"99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"}
Dec 03 19:53:57 crc kubenswrapper[4731]: I1203 19:53:57.459383 4731 scope.go:117] "RemoveContainer" containerID="7aa9760dbe2905d27052527c087aa5641325fe330e7320b8eb33222275cbfb7a"
Dec 03 19:53:57 crc kubenswrapper[4731]: I1203 19:53:57.460329 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:53:57 crc kubenswrapper[4731]: E1203 19:53:57.460701 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:54:12 crc kubenswrapper[4731]: I1203 19:54:12.856222 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:54:12 crc kubenswrapper[4731]: E1203 19:54:12.857324 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:54:27 crc kubenswrapper[4731]: I1203 19:54:27.857144 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:54:27 crc kubenswrapper[4731]: E1203 19:54:27.857842 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:54:38 crc kubenswrapper[4731]: I1203 19:54:38.856174 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:54:38 crc kubenswrapper[4731]: E1203 19:54:38.858268 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:54:50 crc kubenswrapper[4731]: I1203 19:54:50.856624 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:54:50 crc kubenswrapper[4731]: E1203 19:54:50.857519 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:55:01 crc kubenswrapper[4731]: I1203 19:55:01.856533 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:55:01 crc kubenswrapper[4731]: E1203 19:55:01.857340 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:55:12 crc kubenswrapper[4731]: I1203 19:55:12.857000 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:55:12 crc kubenswrapper[4731]: E1203 19:55:12.857856 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:55:24 crc kubenswrapper[4731]: I1203 19:55:24.857642 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:55:24 crc kubenswrapper[4731]: E1203 19:55:24.858395 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:55:37 crc kubenswrapper[4731]: I1203 19:55:37.856619 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:55:37 crc kubenswrapper[4731]: E1203 19:55:37.857237 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:55:49 crc kubenswrapper[4731]: I1203 19:55:49.867419 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:55:49 crc kubenswrapper[4731]: E1203 19:55:49.868035 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:56:00 crc kubenswrapper[4731]: I1203 19:56:00.856724 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:56:00 crc kubenswrapper[4731]: E1203 19:56:00.858936 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:56:15 crc kubenswrapper[4731]: I1203 19:56:15.856052 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:56:15 crc kubenswrapper[4731]: E1203 19:56:15.856799 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"
Dec 03 19:56:29 crc kubenswrapper[4731]: I1203 19:56:29.864355 4731 scope.go:117] "RemoveContainer" containerID="99363310358e1006a7683628063d73911a7ba0ee935cf1f8b519cd7f1e8ac8d7"
Dec 03 19:56:29 crc kubenswrapper[4731]: E1203 19:56:29.865274 4731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mmjcd_openshift-machine-config-operator(95dced4d-3fd5-43d3-b87d-21ec9c80de8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-mmjcd" podUID="95dced4d-3fd5-43d3-b87d-21ec9c80de8b"